Tables for levels 1-6 are quite big.
Investigate the impact of 1) reducing precision and 2) not storing values along with the offset.
Reducing the precision will lose some compression, but probably not much when reduced by 1 bit.
The cost of loading values from the input instead of the table may be offset by the smaller tables, but it will add several more offset checks and a bounds check; see the sketch below.
Each change on its own would reduce the table size by half.
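To make the trade-off concrete, here is a minimal Go sketch of option 2 (not the actual table code; the names, table size and hash constant are made up for illustration): the table keeps only the offset, each entry drops from 8 to 4 bytes, and the candidate value is re-read from the input, which is where the extra bounds check comes in.

```go
package flatesketch

import "encoding/binary"

const tableBits = 15 // hypothetical table size, not the real constant
const tableSize = 1 << tableBits

// Current layout: hash value and offset stored per entry (8 bytes).
type tableEntryWithVal struct {
	val    uint32
	offset int32
}

// Proposed layout: offset only (4 bytes), halving the table size.
type tableEntryOffsetOnly struct {
	offset int32
}

func hash(u uint32) uint32 {
	return (u * 0x1e35a7bd) >> (32 - tableBits)
}

// matchOffsetOnly looks up a match candidate and verifies it by re-reading
// the input, which costs an extra bounds check and load compared to keeping
// the value in the table. pos must leave at least 4 readable bytes in src.
func matchOffsetOnly(table *[tableSize]tableEntryOffsetOnly, src []byte, pos int) (offset int32, ok bool) {
	cur := binary.LittleEndian.Uint32(src[pos:])
	h := hash(cur)
	cand := table[h].offset
	table[h].offset = int32(pos)
	// Extra checks: candidate must be in range and leave room for a 4-byte load.
	if cand < 0 || int(cand)+4 > len(src) {
		return 0, false
	}
	return cand, binary.LittleEndian.Uint32(src[cand:]) == cur
}
```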
The impact of looking up values from the input (option 2) looks positive:
```
λ benchcmp before.txt after.txt
benchmark             old ns/op     new ns/op     delta
BenchmarkGzipL1-12    24008287      23164958      -3.51%
BenchmarkGzipL2-12    25604879      25196246      -1.60%
```
Seems like it could be an overall win, though higher levels may suffer a bit more.
Speed seems largely unaffected by reduced precision, so that change only trades some compression effectiveness for a table half the size.
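For reference, a sketch of option 1 under the same made-up names and sizes: dropping one bit from the hash output halves the number of table entries, at the cost of more hash collisions and therefore slightly worse matches.

```go
package flatesketch

// Hypothetical sizes for illustration only.
const tableBitsFull = 15
const tableBitsReduced = tableBitsFull - 1 // one bit less -> half the entries

// hashReduced is the same multiplicative hash with one fewer output bit,
// so twice as many inputs collide into each table slot.
func hashReduced(u uint32) uint32 {
	return (u * 0x1e35a7bd) >> (32 - tableBitsReduced)
}
```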