In C, should I prefer constants over defines? I've been reading a lot of code lately, and all of the examples make heavy use of defines.

+6  A: 

Constants should be preferred over defines. There are several advantages:

  • Type safety. While C is a weakly typed language, using a define loses all type safety, because the preprocessor pastes the value in as raw text; a typed constant lets the compiler pick up problems for you (see the sketch after this list).

  • Ease of debugging. You can change the value of a constant through the debugger, while defines are replaced in the code by the pre-processor with the actual value, meaning that if you want to change the value for test/debugging purposes, you need to re-compile.
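For illustration, here is a minimal sketch (the names are made up) of how a define, being raw text, can silently change meaning in a way a typed constant cannot:

#include <stdio.h>

#define WIDTH_MACRO 2 + 3             /* pasted as raw tokens: no type, no parentheses */
static const int width_const = 2 + 3; /* a typed int object with the value 5 */

int main(void)
{
    printf("macro: %d\n", WIDTH_MACRO * 4); /* expands to 2 + 3 * 4 and prints 14 */
    printf("const: %d\n", width_const * 4); /* evaluates (2 + 3) * 4 and prints 20 */
    return 0;
}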

LeopardSkinPillBoxHat
I've found that many compilers will bake `const static ints` into immediate opcode constants, just like #defines, making them impossible to change in the debugger as well.
Crashworks
@Crashworks - would that depend on the optimisation level?
LeopardSkinPillBoxHat
even with compiler optimization disabled?
f4
I suppose it might; I guess I'm too used to debugging full-optimization builds.
Crashworks
+1  A: 

If it's something that isn't determined programmatically, I use #define. For example, if I want all of my UI objects to have the same space between them, I might use #define kGUISpace 20.

David Kanarek
+2  A: 

Defines have been part of the language longer than constants, so a lot of older code will use them because defines were the only way to get the job done when the code was written. For more recent code it may be simply a matter of programmer habit.

Constants have a type as well as a value, so they would be preferred when it makes sense for your value to have a type, but not when it is typeless (or polymorphic).

John Knoeller
+5  A: 

Maybe I have been using them wrong, but at least in gcc you can't use constants in case statements.

const int A = 12;
switch (argc) {
    case A:    /* error: case label is not an integer constant expression */
        break;
}
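An enum constant, on the other hand, is an integer constant expression, so a sketch like this does compile:

enum { A = 12 };   /* enum constants are integer constant expressions */
switch (argc) {
    case A:        /* OK: this compiles */
        break;
}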
jbcreix
+3  A: 

define can be used for many purposes (it is very loose) and should be avoided if you can substitute it with const, which defines a variable and lets you do a lot more with it.

In cases like the below, define has to be used:

  • conditional-compilation switches (#if, #ifdef)
  • textual substitution into your source lines
  • code macros

An example where you have to use define over const is when you have a version number, say 3, and you want version 4 to include some methods that are not available in version 3:

#define VERSION 4
...

#if VERSION==4
   ................
#endif 
Fadrian Sudaman
+42  A: 

No, in general you should not use const-qualified objects in C to create named constants. In order to create a named constant in C you should use either macros (#define) or enums. In fact, the C language has no constants in the sense that you seem to imply. (C is significantly different from C++ in this regard.)

In the C language the notions of constant and constant expression are defined very differently from C++. In C, constant means a literal value, like 123. Here are some examples of constants in C:

123
34.58
'x'

Constants in C can be used to build constant expressions. However, since const-qualified objects of any type are not constants in C, they cannot be used in constant expressions, and, consequently, you cannot use const-qualified objects where constant expressions are required.

For example, the following is not a constant

const int C = 123; /* C is not a constant!!! */

and since the above C is not a constant, it cannot be used to declare an array type in file scope

typedef int TArray[C]; /* ERROR: constant expression required */

and it cannot be used as a case label

switch (i) {
  case C: ; /* ERROR: constant expression required */
}

i.e. it cannot be used anywhere a constant is required.

This might seem counter-intuitive, but this is how the C language is defined.

This is why you see these numerous #defines in the code you are working with. Again, in the C language const-qualified objects have very limited use. They are basically useless as "constants", which is why in C you are basically forced to use #define or enums to declare true constants.

Of course, in situations when a const-qualified object works for you, i.e. it does what you want it to do, it is indeed superior to macros in many ways, since it is scoped and typed. You should probably prefer such objects where applicable; in the general case, however, you'll have to take the above into account.
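For illustration, here is a minimal sketch (the names are made up) of the two ways to get true named constants in C:

#define SIZE_MACRO 16             /* textual substitution yields a constant expression */
enum { SIZE_ENUM = 16 };          /* an int-typed, scoped constant expression */

typedef int TArrayA[SIZE_MACRO];  /* OK at file scope */
typedef int TArrayB[SIZE_ENUM];   /* OK as well */

int classify(int i) {
    switch (i) {
        case SIZE_MACRO:    return 1;  /* OK as a case label */
        case SIZE_ENUM + 1: return 2;  /* constant expressions compose */
        default:            return 0;
    }
}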

AndreyT
+1 for understanding and properly articulating C/C++ differences, a point which seems to be lost on many people.
asveikau
Nice answer. Thanks for highlighting the issue with `const`-qualified variables for declaring array sizes.
Craig McQueen
+1 Wow, great answer
Helper Method
I'm not quite sure, but as far as I know, in C99 you can use variables to declare an array type (variable-length arrays, aka VLAs)
Helper Method
@Helper Method: You can. But that still makes them VLAs (with 'V' standing for "variable"), which makes them quite different from ordinary arrays. They are only allowed where VLAs are allowed. The above typedef will not compile even in C99. An array with static storage duration will not compile with such a "constant" as its size. A struct member array will not compile either. And so on.
AndreyT
+1: An excellent, well explained answer. I like to think of `const` as meaning "read-only" rather than "constant". If it were a true constant, there would be no sense to a `volatile const`, but a status register on a microcontroller is usually `volatile const` as it cannot be altered in software but can change between instructions. `const` has its place, but it's a different place to that of `#define`.
Al
You may want to change your wording to say *integer* constant expression instead of merely saying "constant expression". An *address* constant expression can very well contain the name of an object with static storage duration. Judging by the usual level of precision of your answers, you may also want to mention that integer constant expressions can contain their name when they are nested within `sizeof`, too.
Johannes Schaub - litb
+1 to 42, great answer btw :)
Joe
+3  A: 

A lot of people here are giving you "C++ style" advice. Some even say the C++ arguments apply to C. That may be a fair point. (Whether it is or not feels kind of subjective.) The people who say const sometimes means something different in the two languages are also correct.

But these are mostly minor points, and personally I think in truth there is relatively little consequence to going either way. It's a matter of style, and I think different groups of people will give you different answers.

In terms of common usage, historical usage, and most common style, in C, it's a lot more typical to see #define. Using C++isms in C code can come off as strange to a certain narrow segment of C coders. (Including me, so that's where my biases lie.)

But I'm surprised no one has suggested a middle-ground solution that "feels right" in both languages: if it fits into a group of integer constants, use an enum.
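For example, a hypothetical group of related integer constants that reads naturally in both languages:

enum gui_spacing {
    GUI_MARGIN  = 8,
    GUI_PADDING = 12,
    GUI_SPACE   = 20   /* named, typed, and usable wherever a constant expression is required */
};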

asveikau
+1  A: 

Apart from the excellent reasons given by AndreyT for using DEFINES rather than constants in "C" code, there is another, more pragmatic reason for using DEFINES.

DEFINES are easy to define and use from (.h) header files, which is where any experienced C coder would expect to find constants defined. Defining consts in header files is not quite so easy: it's more code, to avoid duplicate definitions etc.
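A sketch of that extra ceremony (the file and variable names are made up):

/* constants.h */
#ifndef CONSTANTS_H
#define CONSTANTS_H

#define MAX_USERS_MACRO 64    /* a define in a header: nothing more to do */

extern const int max_users;   /* a const needs a declaration here... */

#endif

/* constants.c */
#include "constants.h"
const int max_users = 64;     /* ...and exactly one definition in a single .c file */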

Also the "typesafe" arguments are moot most compilers will pick up glaring errors suchh as assing a string to and int, or, will "do the right thing" on a slight mismatch such as assigning an integer to a float.

James Anderson
+1  A: 

Macros (defines) can be used by the pre-processor and evaluated at compile time; constants cannot.

You can do compile-time checks to make sure a macro is within a valid range (and fail with #error if it isn't). You can provide a default value for a macro if it hasn't already been defined. You can use a macro as the size of an array.
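For instance, a sketch of those preprocessor-only tricks (BUFFER_SIZE is a made-up name):

#ifndef BUFFER_SIZE
#define BUFFER_SIZE 128           /* default value if the build didn't define one */
#endif

#if BUFFER_SIZE < 16 || BUFFER_SIZE > 4096
#error "BUFFER_SIZE out of range" /* compile-time range check */
#endif

static char buffer[BUFFER_SIZE];  /* a macro works as an array size */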

A compiler can optimize with macros better than it can with constants:

const int SIZE_A = 15;
#define SIZE_B 15

int i;
for (i = 0; i < SIZE_A + 1; ++i);    // if not optimized, may load SIZE_A and add 1 on each pass
for (i = 0; i < SIZE_B + 1; ++i);    // compiler replaces "SIZE_B + 1" with the constant 16

Most of my work is with embedded processors that don't have amazing optimizing compilers. Maybe gcc will treat SIZE_A like a macro at some optimization level.

tomlogic
+3  A: 

Though this question is specific to C, I guess it is good to know this:

#include <stdio.h>

int main(void) {
    const int CON = 123;
    int *A = &CON;        /* C compilers typically accept this with a warning; C++ rejects it */
    (*A)++;               /* modifying a const object is strictly undefined behavior */
    printf("%d\n", CON);  /* commonly prints 124 in C */
    return 0;
}

This works in C (typically with a warning), but not in C++.

One of the reasons to use #define is to avoid such things messing up your code, especially if it is a mix of C and C++.

Lazer
+1 Nice tip, didn't know that.
Helper Method