I have a very large file I need to parse, so reading it all into memory at once isn't ideal. The way the file is structured, it would be much, much easier if I could start at EOF and work back toward the beginning. Does anyone have a good trick for doing this? I'm using Visual Studio 2008 and C++. Thanks.

+1  A: 

It's not possible to make the file position decrement instead of increment after each read or write; that's why there is an EOF but no "SOF". The only option is to call fseek()/seekg()/seekp() after every read or write to undo the position change, but doing a seek per read will be very slow.
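
For illustration, a minimal sketch of that seek-per-read approach (the file name is a placeholder): it reads one byte at a time from the end, seeking back after every read.

    #include <fstream>

    int main()
    {
        std::ifstream in("large_file.dat", std::ios::binary);  // placeholder name
        if (!in) return 1;

        if (!in.seekg(-1, std::ios::end)) return 0;            // empty file
        for (;;)
        {
            char c;
            in.get(c);
            // ... process c here (bytes arrive in reverse order) ...
            if (in.tellg() <= std::streampos(1)) break;        // just consumed the first byte
            in.seekg(-2, std::ios::cur);                       // undo the advance, step back one byte
        }
        return 0;
    }

One seek plus one one-byte read per character is exactly the "very slow" case described above; reading larger chunks and scanning them backwards in memory amortizes the seek cost.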

Mads Elvheim
+6  A: 

If your operating system supports it, consider using a memory-mapped file. You can then treat the file's contents as a very large array of bytes, with the operating system paging the data into memory (and releasing it) as needed.
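
As a rough sketch of that idea on Windows (the platform implied by Visual Studio 2008), using the Win32 file-mapping API; the file name is a placeholder and error handling is abbreviated:

    #include <windows.h>

    int main()
    {
        HANDLE file = CreateFileA("large_file.dat", GENERIC_READ, FILE_SHARE_READ,
                                  NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (file == INVALID_HANDLE_VALUE) return 1;

        LARGE_INTEGER size;
        GetFileSizeEx(file, &size);

        HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
        const char* data = static_cast<const char*>(
            MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0));   // map the whole file

        // The file contents are now addressable as an ordinary byte array,
        // so they can simply be walked from the last byte to the first.
        for (LONGLONG i = size.QuadPart - 1; i >= 0; --i)
        {
            // ... inspect data[i] ...
        }

        UnmapViewOfFile(data);
        CloseHandle(mapping);
        CloseHandle(file);
        return 0;
    }

Note that a 32-bit process may not be able to map a multi-gigabyte file in a single view; in that case you can map a sliding window using MapViewOfFile's offset parameters, working from the end of the file toward the start.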

anon
+1  A: 

Store the file in the reverse order in the first place.

Matt Joiner
If only I could
Steve
A: 

If you intend to do this a lot, how about creating a reverse BufferedInputStream-style class? You could give it some measure of control over how large a chunk of the file is held in the buffer, and it would hide all the housekeeping from the client.

What others have said about repositioning after each actual file I/O still applies; this would only improve usability, not performance.
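
As a rough sketch of what such a class could look like (the class name and interface are illustrative only):

    #include <fstream>
    #include <string>
    #include <vector>

    // Hands out the file's contents one buffer at a time, starting at EOF and
    // working toward the beginning; all seeking is hidden from the caller.
    class ReverseChunkReader
    {
    public:
        ReverseChunkReader(const std::string& path, std::size_t chunkSize)
            : in_(path.c_str(), std::ios::binary), chunkSize_(chunkSize)
        {
            in_.seekg(0, std::ios::end);
            pos_ = in_.tellg();                       // start at EOF
        }

        // Fills 'out' with the next chunk, moving toward the start of the file.
        // Returns false once the beginning has been passed.
        bool next(std::vector<char>& out)
        {
            if (!in_ || pos_ <= 0) return false;
            std::streamoff want = static_cast<std::streamoff>(chunkSize_);
            std::streamoff len  = (pos_ < want) ? pos_ : want;
            pos_ -= len;
            in_.seekg(pos_, std::ios::beg);           // reposition before every read
            out.resize(static_cast<std::size_t>(len));
            in_.read(&out[0], len);
            return in_.gcount() == len;
        }

    private:
        std::ifstream  in_;
        std::size_t    chunkSize_;
        std::streamoff pos_;
    };

A caller would loop on next() and scan each returned buffer from back to front; anything that spans a chunk boundary (e.g. a line split across two buffers) still has to be stitched together by the caller.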

Gray-M