Pretty much all compression algorithms I know of work in block mode, meaning a random seek isn't possible. Even LZMA, which doesn't use an initial dictionary, requires sequential decompression.
Stream compression usually means adaptive (often lossy) compression with keyframes that reset the state (effectively cutting the stream into blocks). The details are more complex than that.
Now here are a couple of ideas to solve this:
- Create an index: like when you open a ZIP, you can see all the files in it.
- Cut your compressed file into blocks and then use a binary search over the blocks (actually similar to the first idea).
- Decompress in memory, but discard everything until you find the beginning of the data you're looking for.
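The last approach can be sketched in a few lines. This is a minimal example using Python's `zlib`; the function name `read_at_offset` and the chunk size are my own choices, not part of any standard API. It decompresses sequentially and throws away output until it reaches the requested offset, so each read costs O(offset) work:

```python
import zlib

def read_at_offset(compressed: bytes, offset: int, length: int) -> bytes:
    """Sequentially decompress, discarding output before `offset`."""
    d = zlib.decompressobj()
    chunk_size = 64 * 1024

    def produced():
        # Feed the compressed input in chunks so we never hold
        # the whole decompressed file in memory at once.
        for i in range(0, len(compressed), chunk_size):
            yield d.decompress(compressed[i:i + chunk_size])
        yield d.flush()  # any output still buffered in the decompressor

    out = bytearray()
    pos = 0  # count of uncompressed bytes already seen
    for chunk in produced():
        end = pos + len(chunk)
        if end > offset:                      # chunk overlaps the wanted range
            out += chunk[max(0, offset - pos):]
            if len(out) >= length:
                break                         # stop once we have enough
        pos = end
    return bytes(out[:length])
```

Note that it can stop decompressing early once the requested range has been read, but everything before the offset still has to be decompressed and discarded.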
The last way is good for small compressed files, and the block method is good for larger ones. You can also mix the two.
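For larger files, the block-plus-index idea could look something like this sketch. Each fixed-size slice of the input is compressed independently with `zlib`, and the index records where each compressed block starts, so a read only decompresses the blocks it actually needs. The block size, function names, and index layout here are illustrative assumptions, not an established format (tools like `bgzip` do essentially this for real):

```python
import zlib

BLOCK = 4096  # uncompressed bytes per independently compressed block (arbitrary choice)

def compress_blocks(data: bytes):
    """Compress each block independently and record its compressed offset."""
    index, out = [], bytearray()
    for i in range(0, len(data), BLOCK):
        index.append(len(out))               # where this block starts in `out`
        out += zlib.compress(data[i:i + BLOCK])
    index.append(len(out))                   # sentinel: end of the last block
    return bytes(out), index

def read_range(blob: bytes, index, offset: int, length: int) -> bytes:
    """Decompress only the blocks covering [offset, offset + length)."""
    out = bytearray()
    b = offset // BLOCK                      # first block containing the offset
    skip = offset % BLOCK                    # bytes to discard inside that block
    while len(out) < skip + length and b + 1 < len(index):
        out += zlib.decompress(blob[index[b]:index[b + 1]])
        b += 1
    return bytes(out[skip:skip + length])
```

The trade-off is compression ratio: smaller blocks give faster random access but compress worse, because each block starts with an empty dictionary.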
PS: A fixed width in the input doesn't mean the compressed file will be fixed width, so that's pretty useless information here.