
Hi,

I am writing a C program. In the first lines, I have

typedef float m_elem[NMAX][NMAX][3];
m_elem asa_m;   
m_elem asa_mi[100];

then, some calculations. At the moment, for each run I change the value of NMAX in the code depending on the input, then recompile and run. For NMAX values below 500 the program runs fine, but for higher NMAX values (which I need for some input files), all I get is a segmentation fault.

What do you recommend here? I have read about learning Valgrind, but in this case I wonder whether simply changing the compilation options to let the program handle bigger matrices would be enough.

Thanks

+6  A: 

You're probably running out of stack space.

At NMAX=500, the asa_mi variable will need 500 * 500 * 3 * 100 * 4 bytes, or about 300 MB. Most operating systems won't allow a stack that large, so you might want to check into your system's limits.
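On a Unix-like system you can inspect the stack limit from the shell before running the program (assuming bash; the exact command and units can vary by shell and OS):

```shell
# Show the current stack-size limit in kilobytes (often 8192, i.e. 8 MB)
ulimit -s
# To raise it for the current shell session (subject to the hard limit):
#   ulimit -s 524288    # 512 MB
```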

Did you try allocating it from the heap, with malloc() instead?

unwind
static allocation is another option.
Philip Potter
+1  A: 

My guess is that you are running out of stack memory. I recommend allocating memory dynamically when dealing with large amounts of data.

sewa
A: 

As others have said, this may be a stack issue. If the array is a local variable in some function, then this is likely the case. It can fail either because the CPU's ability to index from the stack pointer is limited, or because the stack allocation itself fails, much like a stack overflow brought on by infinite recursion. The latter causes a segmentation fault because the system either enforces an upper limit on your already defined stack space or refuses to grow it automatically for some reason; the details differ between operating systems. The former would only cause a segmentation fault if the CPU's stack indexing were not just limited but broken, so that STACKPOINTER-offset yielded the wrong address; I don't know of any CPUs with this problem, but it's possible. Consider using static, global, or dynamically allocated (malloc) arrays instead, which should avoid this problem.

This could also be a problem related to inconsistent use of the NMAX constant. You may have let a hard-coded value for NMAX slip in somewhere, and when you use other (larger) values you end up overrunning that fixed-size buffer.

Similarly, you could simply have an array-indexing overflow that only causes a segmentation fault when NMAX is large. Since NMAX is used for two of the dimensions of the m_elem type, the array grows quadratically with NMAX. For small values of NMAX the whole array may fit within one memory page, and since operating systems typically track a program's memory by pages, an overflow may not cause a segmentation fault. As NMAX gets bigger the array needs more pages, and the chance increases that an out-of-bounds index reaches a page that contains no part of the array, and likely no writable part of your program at all. If you imagine your 3D array as a castle wall of height 3, length NMAX, and width NMAX, and one wall of that castle is weak and prone to being breached, then each time you increase NMAX you make all of the walls longer, including the faulty one, so there is more faulty wall in your castle.

nategoose
thanks for the detailed and insightful answer
Werner