It is known as a "variable length array". It is dynamic in the sense that its size is determined at run-time and can change from call to call, but it has auto storage class like any other local variable. I'd avoid using the term "dynamic allocation" for this, since it would only serve to confuse. The term "dynamic allocation" is normally used for memory and objects allocated from the heap, whose lifetimes are determined by the programmer (by new/delete, malloc/free) rather than by the object's scope. Variable length arrays are allocated and destroyed automatically as they come in and out of scope, like any other local variable with auto storage class.
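To illustrate the distinction, here is a minimal sketch (the parameter name nObs is taken from your question; the rest is illustrative only) contrasting a variable length array with heap allocation:

void example(size_t nObs)
{
    double vla[nObs];                    /* variable length array: auto storage,
                                            created on entry to this scope,
                                            released automatically on exit    */

    double *heap = malloc(nObs * sizeof *heap); /* dynamic allocation: lifetime
                                                   controlled by the programmer */
    if (heap != NULL)
    {
        /* ... use vla and heap ... */
        free(heap);                      /* must be released explicitly */
    }
}   /* vla ceases to exist here; heap memory would leak if not freed */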
Variable length arrays are not universally supported by compilers; in particular, VC++ does not support C99 (and therefore does not support variable length arrays), and there are no plans for it to do so. Nor does C++ currently support them.
With respect to it being a "safe habit", apart from the portability issue there is the obvious potential to overflow the stack should nObs be sufficiently large. You could to some extent protect against this by making nObs a smaller integer type such as uint8_t or uint16_t, but that is not a very flexible solution, and it makes bold assumptions about the size of the stack and of the objects being allocated. An assert(nObs < MAX_OBS) might be advisable, but by that point the stack may already have overflowed (this may be acceptable, though, since an assert() causes termination in any case).
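One way to apply such a check is to place it before the VLA declaration, so it runs before the stack frame is extended. A minimal sketch, assuming MAX_OBS is a bound you choose to suit your stack budget:

#include <assert.h>

#define MAX_OBS 1024u   /* assumed bound; adjust for your stack size and element type */

void process(unsigned nObs)
{
    assert(nObs < MAX_OBS);  /* checked before the VLA is declared, so the
                                stack has not yet been extended if it fires */
    double obs[nObs];
    /* ... use obs ... */
    (void)obs;
}

This still cannot guarantee safety if the stack is already nearly exhausted by earlier calls, which is the underlying weakness.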
[edit]
Using variable length arrays is probably okay if the size is not externally determined, as in your example.
[/edit]
On the whole, the portability and the stack safety issues would suggest that variable length arrays are best avoided IMO.
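If you do need a run-time-sized array and want to avoid both problems, heap allocation is the portable route: it works in C90, C99 and C++, and failure is detectable rather than silently overflowing the stack. A hedged sketch (process and nObs are illustrative names):

#include <stdlib.h>

void process(size_t nObs)
{
    double *obs = malloc(nObs * sizeof *obs); /* heap allocation instead of a VLA */
    if (obs == NULL)
    {
        return;          /* allocation failure is detectable, unlike a stack overflow */
    }
    /* ... use obs ... */
    free(obs);           /* explicit release when done */
}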