[Feature] In memory LZ4Frame decompression #6
Isn't Pickle/Unpickle enough?
@MiloszKrajewski do Pickle and Unpickle handle Frames? What I am trying to do is to read and play ROS-Bag files... they're essentially video files, where every frame can be compressed using LZ4 or BZ2. I usually need to decompress between 60 and 90 images per second, so I'm trying to avoid any unnecessary byte copy. I am also using ArrayPool to reuse buffers.
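For reference, a minimal sketch of the buffer-reuse pattern described above, using the standard System.Buffers.ArrayPool; the WithPooledBuffer helper and its decode callback are hypothetical stand-ins for whatever LZ4/BZ2 decompression call is actually used:

```csharp
using System;
using System.Buffers;

static class PooledDecode
{
    // Rents a buffer from the shared pool, lets the caller fill it via a
    // decompression callback, and always returns it to the pool afterwards,
    // so no per-frame allocation happens at 60-90 frames per second.
    // 'decode' is a placeholder for the actual LZ4/BZ2 decompression call.
    public static void WithPooledBuffer(int maxDecompressedSize, Action<byte[]> decode)
    {
        byte[] buffer = ArrayPool<byte>.Shared.Rent(maxDecompressedSize);
        try
        {
            decode(buffer);
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}
```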
That's fair. From what I've seen, every frame in a ROS bag file is a mini LZ4 stream. I guess all the magic is in https://github.com/ros/ros_comm/blob/melodic-devel/utilities/roslz4/src/lz4s.c but it also looks like a customized lz4 implementation, so I'm not exactly sure if it would be easy to implement. I understand the stream API can be used on each of those mini-streams in the meantime, though.
Exactly. Right now that's what I am using, and it's decoding the LZ4 mini-blocks just fine. But when it comes to improving performance, any memory reallocation is a bottleneck; I am using Spans, ArraySegments and ArrayPools everywhere. I would also like to try real-time recording/encoding; right now I am recording raw frames at 500 MB/s throughput, and it's already at the limit. So my plan is that, if real-time encoding is not fast enough, I will do it in a post-processing pass.
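As a sketch of the post-process encoding pass mentioned above, assuming the stream-based factory methods from the K4os.Compression.LZ4.Streams package (LZ4Stream.Encode and LZ4Level are the assumed names; the WriteCompressedFrame helper is hypothetical), each raw frame could be appended as its own small LZ4 stream, matching the "mini LZ4 stream per frame" layout described for ROS bags:

```csharp
using System.IO;
using K4os.Compression.LZ4;
using K4os.Compression.LZ4.Streams;

static class FrameRecorder
{
    // Appends one raw image frame to 'output' as its own LZ4-compressed
    // stream. This still goes through the Stream machinery, so it is a
    // workaround sketch rather than the requested zero-copy in-memory encode.
    public static void WriteCompressedFrame(Stream output, byte[] rawFrame)
    {
        // leaveOpen keeps the underlying bag/file stream usable for the next frame
        using var encoder = LZ4Stream.Encode(output, LZ4Level.L00_FAST, leaveOpen: true);
        encoder.Write(rawFrame, 0, rawFrame.Length);
    } // disposing the encoder flushes and finishes the frame
}
```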
Currently I don't provide an in-memory frame API, only the stream-based one. I know it is still not ideal, but it is much more plausible in the short term.
In version 1.0.2 you can now read the content length (if the data was compressed with the content length included).
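A hedged sketch of how that could be used to size the output buffer up front, assuming LZ4Stream.Decode from K4os.Compression.LZ4.Streams and assuming the stored content length is surfaced through the decoder stream's Length property (the DecodeWithKnownLength helper is hypothetical):

```csharp
using System.IO;
using K4os.Compression.LZ4.Streams;

static class SizedDecode
{
    // Decodes an in-memory LZ4 frame into a buffer sized exactly from the
    // content length stored in the frame header (requires that the data was
    // compressed with content length).
    public static byte[] DecodeWithKnownLength(byte[] compressed)
    {
        using var source = new MemoryStream(compressed, writable: false);
        using var decoder = LZ4Stream.Decode(source);
        var output = new byte[decoder.Length]; // content length from the frame header
        int total = 0, read;
        while ((read = decoder.Read(output, total, output.Length - total)) > 0)
            total += read;
        return output;
    }
}
```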
1.3.0-beta has a lot of new streaming methods, and some of them are, let's say, very memory-friendly.
Could we have an API to encode/decode frame blocks directly to arrays/spans?
something like this:
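(A sketch only: the LZ4Frame type and these signatures are hypothetical illustrations of the requested shape, not an existing API in the library.)

```csharp
using System;

// Hypothetical signatures illustrating the request: encode/decode a whole
// LZ4 frame directly between arrays/spans, with no Stream in between.
public static class LZ4Frame
{
    // Would return the number of compressed bytes written to 'target'.
    public static int Encode(ReadOnlySpan<byte> source, Span<byte> target) =>
        throw new NotImplementedException(); // illustration only

    // Would return the number of decompressed bytes written to 'target';
    // the content length from the frame header could be used to size 'target'.
    public static int Decode(ReadOnlySpan<byte> source, Span<byte> target) =>
        throw new NotImplementedException(); // illustration only
}
```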