views:

1857

answers:

5

A few weeks back I was using std::ifstream to read in some files and it was failing immediately on open because the file was larger than 4GB. At the time I couldn't find a decent answer as to why it was limited to 32-bit file sizes, so I wrote my own reader using the native OS API.

So, my question then: is there a way to handle files greater than 4GB in size using std::ifstream/std::ostream (i.e. standard C++)?

EDIT: Using the STL implementation from the VC 9 compiler (Visual Studio 2008). EDIT2: Surely there has to be a standard way to support file sizes larger than 4GB.

A: 

Which implementation were you using? I'm pretty sure this is not a general limitation - I have used iostreams to read >30GB files.

Are you compiling for 64-bit?

Alastair
No, I'm running a 32-bit OS. Regardless, a 32-bit OS is still able to hold files larger than 4GB on disk...
Raindog
+11  A: 

Apparently it depends on how the library implements its stream offset type (off_t / std::streamoff).

#include <ios>     // std::streamsize
#include <limits>  // std::numeric_limits
std::streamsize temp = std::numeric_limits<std::streamsize>::max();

gives you what the current max is.
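
A small self-contained check, for example (nothing assumed beyond standard headers; the streamoff line is included because that is the type the positioning functions use):

#include <ios>       // std::streamsize, std::streamoff
#include <limits>
#include <iostream>

int main()
{
    std::cout << "max streamsize   : "
              << std::numeric_limits<std::streamsize>::max() << '\n'
              << "sizeof(streamoff): " << sizeof(std::streamoff) << '\n';
}

On VC9 std::streamoff is a 32-bit long, which lines up with the 4GB ceiling you hit; implementations with 64-bit offsets print 8 on the second line.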

STLport supports larger files.

eed3si9n
+3  A: 

I ran into this problem several years ago using gcc on Linux. The OS supported large files, and the C library (fopen, etc.) supported them, but the C++ standard library didn't. It turned out that I had to recompile the C++ standard library with the correct compiler flags.
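
For example, on glibc the usual flags are -D_FILE_OFFSET_BITS=64 (and historically -D_LARGEFILE_SOURCE); defining the macro before any system header makes off_t and the stdio offset functions 64-bit even in a 32-bit build. A rough sketch ("huge.bin" is just a placeholder path); the C++ streams layer may still need the rebuild described above:

#define _FILE_OFFSET_BITS 64   // must come before any system header
#include <sys/types.h>
#include <cstdio>
#include <iostream>

int main()
{
    std::cout << "sizeof(off_t) = " << sizeof(off_t) << '\n';     // expect 8
    std::FILE* f = std::fopen("huge.bin", "rb");   // transparently uses fopen64
    if (f) std::fclose(f);
    return 0;
}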

KeithB
I'm with you KeithB; this has _NOTHING_ to do with any defect in MSVC's STL or anyone else's - it's a system deployment issue. The source code for the CRT/STL libraries is included with the compiler for a reason: you may need to #define the appropriate preprocessor macro to get the functionality you want. It's NOT that a library implements only one or the other (e.g. *nix using fgetpos or fgetpos64 for >32-bit file sizes) - the library supports BOTH, and it's up to the developer to use it properly. Using STLport amounts to the same thing as recompiling the MSVC STL anyway; both involve a bit of manual tweaking.
RandomNickName42
+1  A: 

From the Standard's point of view, there is nothing that prevents this. In reality, however, most 32-bit implementations use a 32-bit type for std::size_t, and the C++ Standard mandates that the standard allocator in the C++ Standard Library uses std::size_t as its size type. That limits you to 2^32 bytes of storage for containers, strings and the like. The situation may be different for the stream offset type (off_t / std::streamoff); I don't know exactly what is going on there.

To be safe, you have to use the native API of the OS directly, or some library wrapping it, rather than rely on the Standard Library implementations, whose behaviour here is largely implementation-dependent.
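
For illustration, a rough sketch of what wrapping the Win32 API can look like (read_chunk is a hypothetical helper; error handling omitted) - 64-bit offsets work fine even from a 32-bit process:

#include <windows.h>
#include <vector>

std::vector<char> read_chunk(const wchar_t* path, LONGLONG offset, DWORD count)
{
    std::vector<char> buf(count);
    HANDLE h = CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return std::vector<char>();

    LARGE_INTEGER pos;
    pos.QuadPart = offset;                   // full 64-bit file position
    SetFilePointerEx(h, pos, NULL, FILE_BEGIN);

    DWORD got = 0;
    ReadFile(h, &buf[0], count, &got, NULL);
    buf.resize(got);
    CloseHandle(h);
    return buf;
}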

Johannes Schaub - litb
I ended up wrapping the OS API directly for the case where my file size was larger than 4GB.
Raindog
A: 

If you can move away from using only standard C++, you might be interested in boost::iostreams.
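
A rough sketch using its file_descriptor device (untested here; "huge.bin" is a placeholder path): Boost.Iostreams positions devices with boost::iostreams::stream_offset, a 64-bit type, so seeking past 4GB works even where std::streamoff is only 32 bits.

#include <boost/iostreams/device/file_descriptor.hpp>
#include <boost/iostreams/operations.hpp>    // boost::iostreams::seek
#include <boost/iostreams/positioning.hpp>   // boost::iostreams::stream_offset
#include <iostream>

namespace io = boost::iostreams;

int main()
{
    io::file_descriptor_source dev("huge.bin");
    io::stream_offset pos = io::stream_offset(5) << 30;   // 5 GiB from the start
    io::seek(dev, pos, std::ios::beg);                    // 64-bit capable seek
    char buf[4096];
    std::streamsize got = dev.read(buf, sizeof buf);
    std::cout << "read " << got << " bytes past the 4GB mark\n";
}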

Peter Jansson