In my MyConstants.h file... I have:

int abc[3];

In my matching MyConstants.m file... I have:

extern int abc[3] = {11, 22, 33};

In each of my other *.m files... I have

#import "MyConstants.h"

Inside one of my viewDidLoad methods, I have:

extern int abc[];
NSLog(@"abc = (%d) (%d)", abc[1], sizeof(abc)/sizeof(int));  

Why does it display "abc = (0) (3)" instead of "abc = (22) (3)"?

How do I make this work as expected?

+4  A: 

The extern belongs on the declaration in the header, not on the definition in the source file. extern tells the compiler that the symbol is defined somewhere else, which may or may not be in the same translation unit; it is the linker's job to make sure that every declared symbol is actually defined.

Constants Header (MyConstants.h):

extern int abc[3];

Constants Source (MyConstants.m):

int abc[3] = {11, 22, 33};

Other Source (SomeFile.m):

#include "MyConstants.h"
...
- (void) someMethod
{
    NSLog (@"abc = (%d) (%d)", abc[1], sizeof(abc)/sizeof(int));
}

Also, note that when computing the number of elements in an array, it is less error-prone to divide by the size of the first element; that way, if the element type of abc ever changes (e.g. from int to double), the result is still correct.

- (void) someMethod
{
    NSLog(@"abc = (%d) (%d)", abc[1], sizeof(abc)/sizeof(abc[0]));
}
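
If the element count is needed in more than one place, the divide-by-first-element idiom can be wrapped in a small macro so it is only written once. The sketch below is an illustrative addition, not part of the original answer; the ABC_COUNT name is an assumption:

#import "MyConstants.h"

// Hypothetical helper: number of elements in a statically sized array,
// independent of the element type. Only valid for true arrays, not pointers.
#define ABC_COUNT(array) (sizeof(array) / sizeof((array)[0]))

- (void) someMethod
{
    // abc is declared in MyConstants.h and defined once in MyConstants.m,
    // so abc[1] is 22 and ABC_COUNT(abc) is 3 here.
    NSLog(@"abc = (%d) (%d)", abc[1], (int) ABC_COUNT(abc));
}

As with the plain sizeof expression, this only works where abc is visible with its array bound (the extern int abc[3] declaration in the header provides it); applied to a pointer, sizeof would give the pointer's size instead.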
dreamlax