You may think it is a coincidence that the topic of my question matches the name of this forum, but I actually got here by googling the term "stack overflow".

I use the OPNET network simulator, in which I program in C. I think I am running into a problem with large array sizes: it seems I am hitting some sort of memory allocation limit. It may have to do with OPNET, Windows, my laptop's memory, or, most likely, the C language. The problem occurs when I use nested arrays with a total number of elements coming to several thousand integers. I think I am exceeding an overall memory allocation limit, and I am wondering if there is a way to increase this cap. Here's the exact problem description:

I basically have a routing table; call it routing_tbl[n], with n = 30 nodes (routers) supported. For each node in this table, I keep information about many (hundreds of) available paths, in an array called paths[p]. For each path in that array, I keep the list of nodes that belong to it in an array called hops[h]. So I am using at least n × p × h integers' worth of memory, and the table contains other information as well. In the same function I also use another nested array that consumes almost 40,000 integers. As soon as I run my simulation, it quits, complaining about a stack overflow. It works when I reduce the total size of the routing table. What do you think causes the problem, and how can it be solved? Much appreciated, Ali

+7  A: 

It may help if you post some code. Edit the question to include the problem function and the error.

Meanwhile, here's a very generic answer:

The two principal causes of a stack overflow are: 1) a recursive function, or 2) the allocation of a large number of local variables.

Recursion

If your function calls itself, like this:

int recurse(int number) {
    return (recurse(number));
}

Since local variables and function arguments are stored on the stack, each call pushes another frame; unbounded recursion like this will fill the stack and cause a stack overflow.

Large local variables

If you try to allocate a large array of local variables then you can overflow the stack in one easy go. A function like this may cause the issue:

void hugeStack (void) {

    unsigned long long reallyBig[100000000][1000000000];

    ...
}

There is quite a detailed answer to this similar question.

Andrew Johnson
Technically the recursive function would be tail-call-optimised into a simple infinite loop, but let's not be too pedantic with the example :)
Andrew Johnson
Given the description, it sounds like the problem is too much data pushed onto the stack rather than recursion. I bet putting the variables on the heap (with malloc) will solve the problem.
Jon Ericson
Yeah, that's what I answered at first, but then I thought he might be traversing his network with some function. I figured I'd just add a basic reference answer so anyone else who wanders this way can get a clue. Who knows, this noob may never come back :)
Andrew Johnson
+1  A: 

Stack overflows can happen in C when the recursion depth gets too high. Perhaps you are calling a function from itself too many times?

This error may also be due to allocating too much memory in local (automatic) array declarations. You can switch to dynamic allocation with malloc() to fix this type of problem.

Is there a reason why you cannot use the debugger on this program?

tkerwin
Picky, I know, but depths are not "high" they are "deep". Sorry.
Cyberherbalist
+3  A: 

Somehow you are using a lot of stack. Possible causes: you are creating the routing table on the stack, you are passing it by value, or you are generating lots of nested calls (e.g. by recursively processing the whole structure).

In the first two cases you should create it on the heap and pass around a pointer to it. In the third case you'll need to rewrite your algorithm in an iterative form.

How can you pass an array on the stack in C?
Alexander
+1  A: 

It depends on where you have declared the variable.

A local variable (i.e. one declared on the stack) is limited by the maximum stack size. This is a limit set by the OS and linker (and can usually be adjusted with compiler/linker flags).

A dynamically allocated object (i.e. one on the heap) is limited by the amount of available memory. This is a property of the OS (and can technically be larger than physical memory if you have a smart OS).

Martin York
A: 

You are unlikely to run into a stack overflow with unthreaded compiled C unless you do something particularly egregious, like runaway recursion or a cosmic memory leak. However, your simulator probably has a threading package, which will impose stack size limits: when it starts a new thread, it allocates a chunk of memory for that thread's stack. Likely there is a parameter you can set somewhere that establishes the default stack size, or there may be a way to grow the stack dynamically. For example, pthreads has a function pthread_attr_setstacksize() which you call prior to starting a new thread to set its stack size. Your simulator may or may not use pthreads. Consult your simulator's reference documentation.

mxg
+1  A: 

Many operating systems dynamically expand the stack as you use more of it. When you write to a memory address just beyond the current stack, the OS assumes your stack has grown a bit more and allocates it an extra page (usually 4 KiB on x86, i.e. exactly 1024 four-byte ints).

The problem is, on the x86 (and some other architectures) the stack grows downwards but C arrays grow upwards. This means if you access the start of a large array, you'll be accessing memory that's more than a page away from the edge of the stack.

If you initialise your array to 0 starting from the end of the array (that's right, make a for loop to do it), the errors might go away. If they do, this is indeed the problem.

You might be able to find some OS API functions to force stack allocation, or compiler pragmas/flags. I'm not sure about how this can be done portably, except of course for using malloc() and free()!

Artelius