If a value is defined as

#define M_40 40

Is the size the same as a short (2 bytes) or is it as a char (1 byte) or int (4 bytes)?

Is the size dependent on whether you are 32-bit or 64-bit?

+11  A: 

A #define has no size, as it's not a type but a plain text substitution into your C++ code. #define is a preprocessing directive, and the substitution runs before your code even begins to be compiled.

The size after substitution is whatever the size of the resulting C++ expression is. For example, if you add the L suffix, as in 102L, it is seen as a long; with no suffix it is just an int. So probably 4 bytes on x86 and x64, but this is compiler-dependent.

Perhaps the C++ standard's Integer literal section will clear it up for you (Section 2.13.1-2 of the C++03 standard):

The type of an integer literal depends on its form, value, and suffix. If it is decimal and has no suffix, it has the first of these types in which its value can be represented: int, long int; if the value cannot be represented as a long int, the behavior is undefined. If it is octal or hexadecimal and has no suffix, it has the first of these types in which its value can be represented: int, unsigned int, long int, unsigned long int. If it is suffixed by u or U, its type is the first of these types in which its value can be represented: unsigned int, unsigned long int. If it is suffixed by l or L, its type is the first of these types in which its value can be represented: long int, unsigned long int. If it is suffixed by ul, lu, uL, Lu, Ul, lU, UL, or LU, its type is unsigned long int

Brian R. Bondy
Wish I could +1 once more for the standard quote, very helpful.
GMan
+6  A: 

A plain integer literal is treated as an int in all calculations and assignments.

#define simply tells the preprocessor to replace all references to a symbol with something else. This is the same as doing a global find-replace on your code and replacing M_40 with 40.

Adam Shiemke
+1 Global find-replace is the clearest, most newbie-friendly explanation
Bart van Heukelom
+2  A: 

A #define value has no size, specifically. It's just text substitution. It depends on the context of where (and what) is being substituted.

In your example, where you use M_40, the compiler will see 40 and usually treat it as an int.

However, if we had:

void SomeFunc(long);

SomeFunc(M_40);

It will be treated as a long.

James Curran
Not usually, always. And the constant `40` will still be an `int` in your example code. However, the value `40` will be promoted to a `long` upon calling the function. But `40` the constant is still an `int`.
GMan
@GMan: Why is it still an int even though it could be a simple 1-byte char? Edit (Yes I know char isn't necessarily 1-byte)
0A0D
@GMan: #define L(X) X##L L(M_40) now it's a long. #define S(X) #X S(M_40) now it's a char*, which is why I said "usually".
James Curran
@Changeling: It's just how the language is defined; it's in §2.13/1 in the standard. Without a suffix, the type of the literal will be an `int`; if it cannot fit, it will be a `long int`. If it cannot fit into either of those, the behavior is undefined. Also, a `char` is always 1 byte. :) (A byte might not necessarily be 8 bits, though. Generally, people use "octet" to refer to a collection of 8 bits.)
GMan
@GMan: also, the promotion here happens at compile time, making the difference between "an int promoted to a long" and "be[ing] treated as a long" a very grey area.
James Curran
@James: `40` is a literal that has the type `int`, always and forever. When calling the function, the parameter will be initialized as `long(40)`. In that context, `40` is still a literal with the type `int`. (And the entire expression yields a `long`, promoted from an `int` value.) And in response to your alternative macro, that's like: "Is this an apple?" "Yes" "*squishes it* No it's not, it's apple juice!". Yes `40L` is no longer an `int`...nor is it `40` anymore.
GMan
+1  A: 

The preprocessor just does simple text substitution, so the fact that your constant is in a #define doesn't matter. All the C standard says is that "Each constant shall have a type and the value of a constant shall be in the range of representable values for its type." C++ is unlikely to vary much from that.

Carl Norum
@Carl: Going further, what makes C++ cast the value into an `int` instead of char when 40 is an acceptable value for example, unsigned char since it is between 0 and 255?
0A0D
@Changeling: Based on the C++ standard rules.
Brian R. Bondy
+1  A: 

Preprocessor macros get literally swapped in during the preprocess stage of the compilation.

For example the code

#define N 5

int value = N;

will get swapped for

int value = 5;

when the compiler sees it. The macro itself does not have a size of its own as such.

Simon Walker