views:

80

answers:

2

(Question updated after first comment)

int max_size = 20;
int h[max_size];

Debugging shows a value of [-1] for h when I use max_size to initialize the array.

If instead I initialize with an integer literal, so the code is int h[20], it works fine.

This was with GCC 4.2 on Mac OS X 10.6.

A: 

If I recall correctly, some compilers need to know the size of a stack-allocated array at compile time. If that's the case here, you could make max_size a #define macro or an enum constant (or use an integer literal, as you've done); note that in C, unlike C++, a const int does not count as a compile-time constant expression. Alternatively, you could dynamically allocate the array, in which case the size can be any run-time variable.

ex:

int *array = calloc(max_size, sizeof(int)); /* needs <stdlib.h>; free(array) when done */
Sam
This is not the case for GCC, and when it is the case, a compiler that doesn't support that feature will normally flag the declaration as an error rather than silently compiling nonsense.
Chuck
Ah, that shows some ignorance on my part. Thanks for the correction.
Sam
+2  A: 

I just compiled and ran the following program incorporating your code:

#import <Foundation/Foundation.h>

int main() {
    int max_size = 20;
    int h[max_size];

    h[0] = 5;
    NSLog(@"It is %d", h[0]);

    return 0;
}

It worked fine. So the problem is something other than simply declaring the array.

This was with GCC 4.0.1 on Mac OS X 10.4.

Chuck
My compiler is GCC 4.2 on Mac OS X 10.6. If I change it to LLVM GCC 4.2 it works fine, but then I don't see the variable in my debugger window. I don't even know what LLVM is. There must be another solution.
stone