views:

77

answers:

1

Hello All,

I have inherited some code that I need to maintain, and it can be less than stable at times. The previous developers are no longer available to ask why they ran the application in an environment with the stack limit set to unlimited, so I am curious what the effects of this could be. The application has some unpredictable memory bugs that we cannot track down, and running it under Valgrind is not an option because the slowdown makes the application unusable. Any thoughts on what the effects of an unlimited stack might be are appreciated.
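For reference, you can inspect the limit the application actually runs under from a shell; the /proc path below assumes Linux:

```shell
# Soft stack limit for the current shell, in kBytes (or "unlimited")
ulimit -s

# On Linux, the limits of a running process (here: this shell itself)
grep "Max stack size" /proc/self/limits
```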

Thank you.

+2  A: 

If this is a single-threaded, standard type of program, limiting the stack size is really just a safety precaution. It will make an infinite recursion die before it eats all your memory. By setting the limit to unlimited, you can keep allocating on the stack until it tramples over the heap.

In the classic Unix layout, the heap and the stack start from opposite ends of the address space and grow towards each other: one grows up while the other grows down. When they collide you will not get an error; you will just overwrite data until something bad happens.

Usually you don't need a big stack, but programs that allocate large objects on the stack or recurse deeply may need a larger one.

Edit: just to add to the point about being single-threaded: multi-threaded programs need to allocate more than one stack, which breaks the grow-from-both-ends-toward-the-middle approach. In that case thread stacks are allocated in roughly max-stack-size chunks from the stack side of the address space, so when you blow one thread's stack you are trampling on another thread's stack. Depending on your architecture you might be able to add some page protection in there to limit this, but that is probably TMI at this point ;-)

Ukko