views: 225
answers: 5

I've written a method that does some graphics calculations. In it, you can specify a start direction like "from left", "from right", "from bottom", or "from top".

Now I don't want the user of my method to pass confusing values like 1, 2, 3 or 4 or even strings. Nothing like that. Instead, I would like to create constants like:

kFromLeft, kFromRight, kFromTop, kFromBottom

I've seen this in an Apple header file:

enum CGImageAlphaInfo {
    kCGImageAlphaNone,              
    kCGImageAlphaPremultipliedLast,  
    kCGImageAlphaPremultipliedFirst, 
    kCGImageAlphaLast,               
    kCGImageAlphaFirst,              
    kCGImageAlphaNoneSkipLast,       
    kCGImageAlphaNoneSkipFirst,     
    kCGImageAlphaOnly                
};
typedef enum CGImageAlphaInfo CGImageAlphaInfo;

Five things I don't understand here / that are unclear to me:

1) Why is there a semicolon separating the definition from the typedef?

2) Why do they repeat CGImageAlphaInfo like a parrot?

3) If I put something like this in my header file, I would say in my method that the type of the parameter is CGImageAlphaInfo (of course I'll have a different name), right?

4) I would normally specify the values for those constants like this (example):

#define kCGImageAlphaNone 100
#define kCGImageAlphaPremultipliedLast 300
#define kCGImageAlphaPremultipliedFirst 900

5) Am I required to set those constants to such stupid values? Or could I just check inside my method which constant got passed in, like

if(paramConst == kCGImageAlphaNone) {...}

?

+1  A: 

enum stands for "enumerated type".

This is a normal type declaration with an inline definition; having a semicolon at the end is syntactically correct.

No idea about the repetition; someone more familiar with this Objective-C convention can probably answer that.

The type should indeed be CGImageAlphaInfo.

Well, "normally" is relative; in this case, using an enum for this is pretty normal.

In Java you would do this by checking the parameter's equality against the enum symbol.

Stefan Thyberg
+3  A: 

1) A semicolon is needed to terminate the enum definition.

2) CGImageAlphaInfo is the name of the enum and the name of the defined type.

3) Right.

4) Using #define for constants is often considered to be archaic C programming style. Declaring constants in an enum gives static analyzers more information than preprocessor macros do.

5) You should use the symbols, not numeric literals.
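For example, a minimal C sketch along the lines of the question (the StartDirection name and the kFrom… constants are made up here, not from the question's code): compare and branch on the symbols, never on raw numbers, and with an enum-typed parameter many compilers can also warn about a switch that misses a case.

typedef enum {
    kFromLeft,
    kFromRight,
    kFromTop,
    kFromBottom
} StartDirection;

void drawFrom(StartDirection direction)
{
    switch (direction) {
        case kFromLeft:   /* draw starting at the left edge  */ break;
        case kFromRight:  /* draw starting at the right edge */ break;
        case kFromTop:    /* draw starting at the top edge   */ break;
        case kFromBottom: /* draw starting at the bottom     */ break;
    }
}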

Kristopher Johnson
Thanks. So I think I won't manually assign values to them in my implementation file. I'm lazy ;)
Thanks
+6  A: 

1) A semicolon always terminates an enum statement. In this case there are two separate statements: one defines a named enumeration, the next defines a new type.

2) The enum statement creates a new type called "enum CGImageAlphaInfo". But typing this everywhere is cumbersome, so the typedef statement is used. The typedef statement works like this:

typedef <sometype> <newname>;

So enum CGImageAlphaInfo is the old type, and CGImageAlphaInfo is the new name. Apple uses the same name for both, which is a bit confusing but is really the best way to go about it.
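As a small illustration (not from Apple's headers, but relying on the declarations shown in the question), once both statements have been seen the two spellings name the same type:

enum CGImageAlphaInfo info1 = kCGImageAlphaNone;   /* the "long" name created by the enum statement */
CGImageAlphaInfo info2 = kCGImageAlphaNone;        /* the short alias introduced by the typedef     */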

3) Right.

4) You can do this, but then you have to manually assign the constant values; with an enum, values are assigned automatically. The main benefit, though, is that you get some type checking, since you can use the CGImageAlphaInfo type instead of just a plain int, which could more easily be assigned invalid values.

5) I'm not sure what you mean by "stupid values". But yes, you should always check using the name in the way you describe, and never use some raw value like "300" or "1".
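Putting 4) and 5) together, a sketch of what that could look like (the method name is made up; CGImageAlphaInfo and kCGImageAlphaNone are the ones from the question): give the parameter the typedef'd type instead of a plain int, then compare against the symbols inside the method.

- (void)drawImageWithAlphaInfo:(CGImageAlphaInfo)alphaInfo
{
    if (alphaInfo == kCGImageAlphaNone) {
        /* handle the "no alpha channel" case */
    }
}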

Adam Ernst
Thanks. Yes, that's one of my goals: the user of the method should not be able to pass in something that makes no sense to my method ;)
Thanks
Great. Keep in mind they still can, by casting a "stupid" value as a CGImageAlphaInfo. But in general it's harder.
Adam Ernst
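For instance, something like this still compiles, because the cast tells the compiler to accept the value (illustration only, assuming the CGImageAlphaInfo type from the question is in scope):

CGImageAlphaInfo forced = (CGImageAlphaInfo)300;   /* nonsense value pushed through with an explicit cast */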
+1  A: 

The things that are unclear to you:

1) Because the declaration of the enum is separate from the declaration of the type. In the sample code, the programmer first declares the enum, then declares the new type to use that enum.

2) It might be easier if you look at it with (syntactically incorrect) quotes: typedef "enum CGImageAlphaInfo" CGImageAlphaInfo. You're defining the type CGImageAlphaInfo to be the same as the enum.

3) Correct.

4) Your #define method would work fine. Basically, enums are just another way of doing that kind of constant definition, but the compiler assigns the values rather than having you pick a constant for each name. Each value in an enum is guaranteed to be distinct from the rest without you having to go through and make sure. They're also error-checked as necessary.
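For instance, a small standalone C program (with made-up direction constants) shows the compiler handing out the values itself, starting at 0:

#include <stdio.h>

enum { kFromLeft, kFromRight, kFromTop, kFromBottom };

int main(void)
{
    /* Prints "0 1 2 3": the values were assigned automatically. */
    printf("%d %d %d %d\n", kFromLeft, kFromRight, kFromTop, kFromBottom);
    return 0;
}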

Tim
Thanks. So there is no need to manually assign random values to each constant in my implementation file? That's cool.
Thanks
+3  A: 

Using an enum rather than a pre-processor define is the best way to handle this. For example,

typedef enum
{
    FromTop = 0,
    FromBottom = 1,
    FromLeft = 2,
    FromRight = 3
} GraphicsLocation;

Put this in a Constants.h file and import it where needed. You actually don't need to include the = 0, = 1, etc., although it's useful if you need to specify the actual numeric value to use elsewhere (for example, if you're setting the tag attribute of a menu item in Interface Builder).
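A usage sketch (the method name and the drawing details are made up; GraphicsLocation and its constants are the ones declared above):

#import "Constants.h"

- (void)slideInFrom:(GraphicsLocation)location
{
    switch (location) {
        case FromTop:    /* start above the visible area           */ break;
        case FromBottom: /* start below the visible area           */ break;
        case FromLeft:   /* start to the left of the visible area  */ break;
        case FromRight:  /* start to the right of the visible area */ break;
    }
}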

I've always declared enums like this, rather than the Apple example you posted above. Maybe someone more familiar with C can leave a comment explaining the difference.

Marc Charbonneau
Typing 'enum { /* values */ };' results in a type of 'anonymous enum' being assigned to the constants within the enum, which can cause problems if two different enums declare different values for 'FromTop'. Each is scoped to its own enum, but you can't tell which one you're seeing. The other difference (using a separate typedef statement) gives a clearer definition, but also allows for easier changes later if you need to specify the size of the 'GraphicsLocation' value; for instance, see typed enumerations in Foundation in OS X 10.5: they now typedef using NSInteger to get word-sized storage.
Jim Dovey
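For reference, the Foundation-style "typed enumeration" pattern that the last comment describes looks roughly like this (a sketch reusing the GraphicsLocation example from the answer above, not copied from Apple's headers):

#import <Foundation/Foundation.h>

enum {
    FromTop,
    FromBottom,
    FromLeft,
    FromRight
};
typedef NSInteger GraphicsLocation;   /* word-sized storage, as the comment describes for the 10.5 Foundation headers */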