Suppose I have a C function:

void myFunction(..., int nObs){
    int myVec[nObs] ;
    ...
}

Is myVec being dynamically allocated? nObs is not constant; it can differ each time myFunction is called. I ask because this is how I currently program, and a friend was having errors in his program where the culprit was that he didn't dynamically allocate his arrays. I want to know whether my habit of declaring arrays as in the example above is a safe one.

Thanks.

+1  A: 

It's compiler-dependent. I know it's ok with gcc, but I don't think the C89 spec allows it. I'm not sure about newer C specs, like C99. Best bet for portability is not to use it.

Eric Warmenhoven
Variable length arrays are part of C99 (http://publib.boulder.ibm.com/infocenter/comphelp/v8v101/index.jsp?topic=/com.ibm.xlcpp8a.doc/language/ref/variable_length_arrays.htm)
R Samuel Klatchko
MSVC knows not about C99. They're focusing on the C++ stuff.
Arthur Kalliokoski
MSVC isn't a C compiler, it's a compiler for a language that closely resembles C as it was *21 years ago*.
Stephen Canon
+3  A: 

To answer your question: it's not considered dynamic allocation, because the array is allocated on the stack. Before this was allowed, on some platforms you could simulate the same variable-length allocation on the stack with the function alloca(), but that was not portable. This is (if you program for C99).

Pascal Cuoq
alloca() couldn't do the row-column math for you like a two dimensional array can.
Arthur Kalliokoski
+1  A: 

It is known as a "variable length array". It is dynamic in the sense that its size is determined at run-time and can change from call to call, but it has auto storage class like any other local variable. I'd avoid using the term "dynamic allocation" for this, since it would only serve to confuse.

The term "dynamic allocation" is normally used for memory and objects allocated from the heap, whose lifetimes are determined by the programmer (via new/delete or malloc/free) rather than by the object's scope. Variable length arrays are allocated and destroyed automatically as they come into and go out of scope, like any other local variable with auto storage class.

Variable length arrays are not universally supported by compilers; in particular, VC++ does not support C99 (and therefore does not support variable length arrays), and there are no plans for it to do so. Nor does C++ currently support them.

With respect to it being a "safe habit": apart from the portability issue, there is the obvious potential to overflow the stack should nObs be a sufficiently large value. You could protect against this to some extent by making nObs a smaller integer type, uint8_t or uint16_t for example, but that is not a very flexible solution, and it makes bold assumptions about the size of the stack and of the objects being allocated. An assert(nObs < MAX_OBS) might be advisable, but by that point the stack may already have overflowed (this may be OK though, since an assert() causes termination in any case).

[edit] Using variable length arrays is probably okay if the size is not externally determined; in your example, nObs comes in from the caller. [/edit]

On the whole, the portability and the stack safety issues would suggest that variable length arrays are best avoided IMO.

Clifford
You can overflow the stack with constant-size arrays or even just normal local variables. If you aren't prepared to deal with that sort of concern, you shouldn't be using C.
Stephen Canon
@Stephen Canon: The difference is that they are constant and therefore under 'local control' - the writer of the function applies the restriction. The writer of the function may not be the same as the user of it (it may be a third-party library, legacy code, or a team development, for example), who may or may not be aware of how the memory is allocated and the restrictions that imposes. Also, in the worst case the size may be determined by end-user input, which if not suitably bounded might cause trouble. My point is that it has *additional* dangers not presented by constant length arrays.
Clifford
@Clifford: I agree that there are extra risks. However, the C language itself is an "additional danger"; do you conclude that C is "best avoided", or that it should be used in appropriate circumstances by programmers who know what they're doing?
Stephen Canon
I've changed my code to use calloc/malloc with free. Now what if there is HEAP overflow? After this discussion, I feel like I shouldn't even use variable declarations in my functions; I should dynamically allocate everything, even variables of length 1! =)
Vinh Nguyen
@Vinh: There is no such thing as "heap overflow", though an allocation can simply fail. If malloc() fails, it returns NULL. If new fails, it throws an exception, unless you use new(nothrow), in which case it returns zero. On most systems heap allocation is ultimately the OS's responsibility, and the behaviour on failure may be different if the OS handles it instead. On a modern desktop OS you have a number of GB of virtual memory regardless of available physical memory, so in most instances this is not an issue. Conversely, the stack in a typical Windows thread, for example, is a couple of MB.
Clifford
Gotcha! So is the recommended way to always check whether the pointer returned from calloc/malloc is NULL and then write some kind of handler for it? This seems WAY tedious though. Also, would you recommend always allocating dynamically via calloc/malloc rather than declaring variables like "int x;" and "double y;" in functions?
Vinh Nguyen
@Vinh: I am recommending nothing, merely describing the behaviour; you choose. On a modern desktop system the chances of failure are slim, by virtue of the massive amounts of memory available. I believe that on Linux malloc() never truly returns NULL; the OS raises an out-of-memory error and terminates the process. Certainly do not use dynamic memory allocation for small objects of known size. I would not recommend using malloc() at all over using C++ and the `new` operator.
Clifford