Hi,

While using Eclipse on a Mac (2 GB RAM), I have encountered the following problem: whenever I try to create an array that exceeds 8384896 bytes, I get segmentation faults. The following program executes fine:

#include <stdio.h>

int main(void)
{
    double x[1048112];
    printf("sizeof(x) = %zu\n", sizeof(x));
    return 0;
}

and the output would be (as expected):

sizeof(x) = 8384896

But increasing the number of elements in x, or creating additional variables in main(), results in a program that segfaults as soon as it runs. It looks like I'm hitting some memory limit, and I don't understand why this is happening. I'd be really grateful if anyone could explain this to me, or perhaps suggest a solution.

+13  A: 

This is a stack overflow due to excessively large stack variables.

If you really need to allocate something that large, you can do so on the heap using malloc:

double *x = malloc(1048112 * sizeof(double));

Note that with this change, sizeof(x) no longer returns the size of the array, it returns the size of double *. If you need to know how large your array is, you'll need to keep track of that on your own.

And, just for completeness, when you are done with the data, you will need to call free, otherwise you'll have one heck of a memory leak:

free(x);
James McNellis
and don't forget to call free at the end!!!
Johannes Rudolph
Well, I kind of hoped he'd know that, but I put it in there anyway.
James McNellis
+2  A: 

Yup, you are hitting a memory limit ... specifically, you are overflowing the stack. So what should you do? Allocate the memory on the heap, like so:

double *x = malloc(1048112 * sizeof(double));
Aviral Dasgupta
A: 

You can either increase the amount of allowed space on the stack through linker settings, or preferably start using dynamic memory.

AraK
Futzing with the linker won't help; it is a soft O/S limit on Mac OS X (and the hard upper bound is 64 MB by default for non-root users).
Jonathan Leffler
+5  A: 

A process on OS X is limited, by default, to an 8 MB stack (try running ulimit -s from the command line).

One option is to try to increase the stack size with something like ulimit -s 65536. This affects all new processes run from the current shell session.

A better option is to allocate the array on the heap:

 double *x = malloc(1048112 * sizeof(double));

And when you are finished with the array, don't forget to deallocate it using: free(x)

EDIT: try this reference for information on how to use the linker to increase maximum stack size on OS X. Again, the preferred option is just to allocate large arrays on the heap. Easier and more portable.

nimrodm
Note that the output of 'ulimit -H -s' shows 65536 as the hard upper bound on the stack size; only a root-owned process could set its limit to a larger value.
Jonathan Leffler
+3  A: 

Other solutions to the problem include using a static array:

 static double x[1234567];

in your function, or using a global variable outside the function. If the global array is declared static it won't be visible outside the file it is compiled from.

Either way, the array will not be renewed each time you call the routine, so you won't get a "fresh start" each time you call it but the same old data as before.

Kinopiko
I would note that this results in his function being non-reentrant.
James McNellis
Thanks for pointing that out. It's an important point.
Kinopiko
And it also consumes a constant 9.8MB of memory regardless of whether the function is running or not.
Crashworks
+2  A: 

The malloc based solution is right, but this one will save you the trouble of having to track memory yourself:

#include <stdio.h>

static double x[1048112];

int main(void)
{
    printf("sizeof(x) = %zu\n", sizeof(x));
    return 0;
}

Variables declared static outside of a function body are not allocated on the stack, and their visibility is limited to the file they're defined in.

fvu
A: 

As a side note: In my experience, in some cases the questions of that nature ("why does my huge automatic array cause a crash") are rooted in the beginners' misunderstanding of the physical nature of an array object. More than once I encountered people that firmly believed that an array physically consists of a "pointer" that points to an additional block of memory for the actual elements. They also believed that an automatic array occupies only a tiny amount of stack memory (for the alleged "pointer"), while the large element block is allocated elsewhere, not in the stack. This might be one of the possible reasons why people (even those who are perfectly aware of the "limitedness" of stack space) see no issues with defining large automatic arrays.

AndreyT
A: 

Just so you know, on Windows this limit is 1 MB.

This code works:


    void myfunction()
    {
        static char yes[1100000];   /* static storage, not the stack */
    }

This code doesn't work:


    void myfunction()
    {
        char yes[1100000];          /* allocated on the stack */
    }

Arabcoder