streamHC branch using new lz4hc streaming API #30
Hi Beyers
It's still early days for this branch, so I can't promise good results yet. The API passes a lot of simple tests, but that doesn't mean it covers all situations. For example, I'm currently struggling with a ring buffer scenario (tests & debugging ongoing). So if you find a use case generating bad results, please don't hesitate to describe it, or even post a code sample. This is the right time to integrate those use cases into the automated test tool.
Yes
Yes, it's the final objective. The streamHC branch will be merged into dev once it becomes fully compliant with this scenario. Regards
Hi Yann, My current use case is network packet compression. For testing purposes I read packets from a PCAP dump, compress the data from the transport layer and up of each packet, decompress the compressed data, and create a new pcap dump file. To confirm everything is working as it should, I diff the original pcap file against the new one. I'm using a ring buffer similar to your example blockStreaming_ringBuffer.c implementation. With standard lz4, this test passes 100%. Switching to lz4hc streaming, the new pcap file is not the same size as the original. Here is a portion of the lz4hc streaming code I'm using, maybe you can spot something obvious I'm doing wrong:
I'm assuming the decompression code is implemented correctly, seeing that standard lz4 works perfectly. The HC compression code is also pretty much the same as the lz4 compression code, apart from the hc API change. Any idea?
Yes, I have a first fix ongoing, but it won't solve the entire situation. If you need a quick solution, please consider a "double buffer" methodology, which should work correctly today.
Awesome, I can certainly refactor my code to use double buffer instead. Will report back if that workaround fixes things for me.
@Cyan4973 I can confirm that the double buffer does indeed work with lz4hc. The compression ratio dropped significantly though, probably due to our small data packet size. I ended up with a slightly modified double buffer strategy: I process 256 packets per buffer, then copy the dictionary to the other buffer, and vice versa. This seems to yield compression ratios similar to the circular buffer strategy, and it works with lz4hc. Thank you for the assistance!
You're welcome.
OK, with the latest round of tests, I believe the new streaming implementation available within the LZ4 "dev" branch is now fully compatible with the ring buffer scenario, including ring buffers of different sizes on the compression and decompression sides.
Awesome, will test on my side as well! Thank you @Cyan4973 |
@Cyan4973 First off, great work on this library. We are getting good results using it.
I'm testing the latest HC streaming API on the streamHC branch, but decompression results are invalid. A couple of questions: