views:

325

answers:

1

Is there a way to read a large text file (~60MB) into memory at once (e.g., a compiler flag to increase the program's memory limit)? Currently the program segfaults in ifstream's open function while trying to open this file.

#include <fstream>

std::ifstream fis;
fis.open("my_large_file.txt"); // Segfaults here

The file just consists of rows of the form

number_1<tabspace>number_2

i.e., two numbers separated by a tab character.
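For scale, 60 MB of tab-separated integer pairs fits easily in memory on any modern machine, so no special flag should be needed. A minimal sketch of reading such rows into memory (the helper name is illustrative, not from the original post):

```cpp
#include <cassert>
#include <fstream>
#include <sstream>
#include <utility>
#include <vector>

// Read every "number<TAB>number" row from a stream into memory.
std::vector<std::pair<long, long>> read_pairs(std::istream& in) {
    std::vector<std::pair<long, long>> rows;
    long a = 0, b = 0;
    while (in >> a >> b)  // operator>> treats the tab as whitespace
        rows.emplace_back(a, b);
    return rows;
}
```

Usage would be `std::ifstream fis("my_large_file.txt"); auto rows = read_pairs(fis);`.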

+4  A: 

You have some other problem, because you aren't reading the file by just calling open. My guess is the file doesn't exist (or doesn't exist in the relative path you think it should exist in).

Blindy
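One way to act on this advice is to check the stream state immediately after open and report the OS-level reason. A sketch (the helper name is mine; note the C++ standard does not guarantee that fstream sets errno, though POSIX implementations generally do):

```cpp
#include <cassert>
#include <cerrno>
#include <cstring>
#include <fstream>
#include <string>

// Open `path`; on failure, return the likely OS-level reason.
// Most implementations leave errno set by the underlying open(2),
// but the standard does not require this.
std::string open_or_explain(const char* path, std::ifstream& fis) {
    errno = 0;
    fis.open(path);
    if (fis) return "ok";
    return std::string("cannot open: ") + std::strerror(errno);
}
```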
I concur with your "some other problem" comment, but neither of those should cause a SEGV.
paxdiablo
The path is valid, because it pauses for around 20s trying to open the file, then segfaults. I'm able to open smaller files in the same directory with this call.
NoneType
Actually it can segv if it throws an unhandled exception: "depending on the value set with exceptions an exception may be thrown". And because it waits for 20s, I'm going to further guess it's trying to browse the network for some reason (network mapped drive?)
Blindy
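The exception behavior quoted above is opt-in: a stream only throws if you enable it via `exceptions()`. A sketch that surfaces open failures as catchable exceptions (the helper name is mine):

```cpp
#include <cassert>
#include <fstream>
#include <string>

// Try to open `path` with stream exceptions enabled, so a failed open
// raises std::ios_base::failure instead of silently setting failbit.
bool try_open(const char* path, std::string& error) {
    std::ifstream fis;
    fis.exceptions(std::ifstream::failbit | std::ifstream::badbit);
    try {
        fis.open(path);
        return true;
    } catch (const std::ios_base::failure& e) {
        error = e.what();
        return false;
    }
}
```

If such an exception is left unhandled, the runtime calls std::terminate and aborts the process; on UNIX that can still write a core file, though the signal is SIGABRT rather than SIGSEGV.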
If it's a UNIX system, that 20s may be partially spent writing out the core file.
paxdiablo