views:

332

answers:

8

What is the role of the #define directive?

+13  A: 

#define is used to create macros in C and in C++. You can read more about it in the C preprocessor documentation. The quick answer is that it does a few things:

  1. Simple Macros - basically just text replacement. Compile time constants are a good example:

    #define SOME_CONSTANT 12
    

    simply replaces the text SOME_CONSTANT with 12 wherever it appears in your code. This sort of macro is often used to provide conditional compilation of code blocks. For example, there might be a header included by each source file in a project with a list of options for the project:

    #define OPTION_1
    #define OPTION_2
    #undef  OPTION_3
    

    And then code blocks in the project would be wrapped with matching #ifdef/#endif blocks to enable and disable those options in the finished project. Passing the -D flag to gcc provides similar behaviour. There are strong opinions as to whether or not this method is really a good way to provide configuration for an application, however.

  2. Macros with arguments - allows you to make 'function-like' macros that can take arguments and manipulate them. For example:

    #define SQUARE(x)  ((x) * (x))
    

    would return the square of the argument as its result; be careful about potential order-of-operations or side-effect problems! The following example:

    int x = SQUARE(3);     // becomes int x = ((3) * (3));
    

    works fine, but something like:

    int y = SQUARE(f());   // becomes int y = ((f()) * (f()));
    

    will call f() twice, or even worse:

    int z = SQUARE(x++);   // becomes int z = ((x++) * (x++));
    

    results in undefined behaviour!

    With some tools, macros with arguments can also be variadic, which can come in handy.

As mentioned below in the comments, overuse of macros, or the development of overly complicated or confusing macros is considered bad style by many - as always, put the readability, maintainability, and debuggability of your code above 'clever' technical tricks.

Carl Norum
+1. And since the OP asked about the "role" that `#define` plays, it should be emphasized that macros shouldn't be over-used.
stakx
Variadic macros are standard in C99, so most modern tools should support them.
pkh
+1, but should add an example of how SQUARE() will be *expanded* in a few situations.
greyfade
I'd emphasise using them for configuration (sections of code in `#ifdef`... `#endif`). In C++ it's probably better to use consts for 1), and templates for 2).
AshleysBrain
@pkh: "most modern tools" do not support C99 though. Most compilers support bits and pieces of C99, but that version of the language has never really caught on, so simply saying "it's C99" doesn't automatically imply "it's well-supported"
jalf
@pkh: Variadic macros are not in standard C++, although they're in the current committee draft for the next version of the standard (probably, as I understand it, due out next year). Not to mention that jalf is right.
David Thornley
Good comments, guys. I'll edit some of that in now.
Carl Norum
@jalf -- I disagree. "Most modern tools" do support C99, where modern is released in the last 5 years, and probably 10 years too. The sole holdout/exception I'm aware of is Microsoft
Chris Dodd
These are common misuses of macros in my opinion (for C++). This is not what the macro system is designed for in C++ (admittedly it was actually designed for C, and the above are common good uses for C). Each of the above examples has a safer alternative in C++. The real use of macros is conditional compilation for different types of architecture.
Martin York
I agree with Martin and Ashleys, that's exactly what you do **NOT** want to use macros for in C++. Though I would say that restricting macros to conditional compilation is a bit harsh, I have also used preprocessor programming instead of creating another body of script to generate code for me.
Matthieu M.
@David Thornley: I'm aware they're not standard in C++. However, to my knowledge I haven't used a toolchain where there was a distinction between the preprocessors for the two languages. @jalf: http://stackoverflow.com/questions/2812433
pkh
+7  A: 

#define (and its opposite, #undef) can be used to define preprocessor symbols which can then be tested using #ifndef or #ifdef. This allows custom behaviours to be defined within the source file. It's commonly used to compile for different environments or to include debug code.

An example:

#define DEBUG

#ifdef DEBUG
//perform debug code
#endif
Eugarps
A: 

In C or C++, #define allows you to create preprocessor macros.

In the normal C or C++ build process, the first thing that happens is that the preprocessor runs; the preprocessor looks through the source files for preprocessor directives like #define or #include and then performs simple operations with them.

In the case of a #define directive, the preprocessor does simple text-based substitution.

For example if you had the code

#define PI 3.14159f

float circum = diameter*PI;

the preprocessor would turn it into:

float circum = diameter*3.14159f;

by simply replacing the instances of PI with the corresponding text. This is only the simplest form of a #define statement; for more advanced uses, check out this article from MSDN.

luke
A: 

inCorrectUseOfHashDefine()

{

The role of #define is to baffle people who inherit your code with out of the blue statements like:

foreverandever

because of:

#define foreverandever for(;;)

}

Please favour constants over #define.

It's also for setting compiler directives...

David Relihan
Why the -2? Its a valid point and I stick by it.
David Relihan
because this is *not* the role of a #define but a *misuse* thereof.
René Nyffenegger
@Rene OK - I've made my intentions more explicit
David Relihan
@Rene: You are seriously telling me you missed the sarcasm?
Troubadour
@Troubadour: I am not telling anyone I missed the sarcasm, but I don't think that this answer helps anyone who wants to understand #defines. After explaining #defines, it's Ok, to point out the various problems that come with it if not used thoughtfully, though.
René Nyffenegger
+1  A: 

The #define directive has two common uses.

The first is to control how the compiler acts. To do this, we also need #undef, #ifdef and #ifndef (and #endif too...).

You can make "compiler logic" this way. A common use is to activate or deactivate a debug portion of the code, like this:

#ifdef DEBUG

//debug code here

#endif

And you would be able, for example, to compile the debug code in by writing a #define DEBUG.

Another use of this logic is to avoid double includes...

For example, file A #includes files B and C, but file B also includes C. This will likely result in a compilation error, because "C" exists twice.

The solution is write:

#ifndef C_FILE_INCLUDED
#define C_FILE_INCLUDED

//the contents of header "c" go here.

#endif

The other use of #define is to make macros.

The most simple ones, consist of simple substitutions, like:

#define PI 3.14159265

float perimeter(float radius) {
    return radius*2*PI;
}

or

#define SHOW_ERROR_MESSAGE printf("A serious error happened");

if ( 1 != 1 ) { SHOW_ERROR_MESSAGE }

Then you can also make macros that accept arguments; assert, for instance, is usually a macro, created with a #define in a header file.

But this should not be done, for two reasons: first, the speed of macros is the same as using inline functions, and second, we have C++ templates, which allow more control over functions with variable type. So the only reason to use macros with arguments is to make strange constructs that will be hard to understand later, like metaprogrammed stuff...

speeder
Wow, it took forever for me to type my post, now it looks that I just copied the posts of two previous guys and mashed them in one :(
speeder
+1  A: 

The most common use (by far) of #define is for include guards:

// header.hh
#ifndef HEADER_HH_
#define HEADER_HH_

namespace pony {
// ...
}

#endif

Another common use of #define is in creating a configuration file, commonly a config.h file, where we #define macros based on various states and conditions. Then, in our code we test these macros with #ifdef, #elif defined() etc. to support different compiles for different situations. This is not as solid as the include-guard idiom and you need to be careful here because if the branching is wrong then you can get very obscure compiler errors, or worse, runtime behavior.

In general, other than for include guards, you need to think through the problem (twice, preferably) and see if you can use the compiler rather than the preprocessor to solve it. The compiler is just smarter than the preprocessor. Not only that, but the compiler can't possibly confuse the preprocessor, whereas the preprocessor most definitely can confuse and mislead the compiler.

wilhelmtell
A: 

Most things about #defines have already been told, but it's not always made clear that C++ has better replacements for most of their uses:

  1. #define to define numerical constants can be easily replaced by a const "variable", that, as a #define, doesn't really exist in the compiled executable. AFAIK it can be used in almost all the situations where you could use a #defined numerical constant, including array bounds. The main advantage for me is that such constants are clearly typed, so there's no need to add casts in the macros "just to be sure", and are scoped, so they can be kept in namespaces/classes/functions, without polluting all the application.

 

const int max_array_size=50;
int an_array[max_array_size];
  2. #define to create macros: macros can often be replaced by templates; for example, the dreaded MAX macro

 

#define MAX(a,b)    ((a)<(b)?(b):(a))

, which has several downsides (e.g. repeated argument evaluation, inevitable inline expansion), can be replaced by the max function

template<typename T> const T & max(const T & a, const T & b)
{
    return a<b?b:a;
}

which can be type-safe (in this version the two arguments are forced to be of the same type), can be expanded inline as well as not (it's compiler decision), evaluates the arguments just once (when it's called), and is scoped. A more detailed explanation can be found here.

Still, macros must be used for include guards, and to create some kinds of strange language extensions that expand to more lines of code, that have unbalanced parentheses, etc.

Matteo Italia
+1  A: 

In C++, #define has very narrow, specialized roles:

  • Header guards, described in other answers
  • Interacting with the standard libraries. For instance, #defining NOMINMAX before including windows.h turns off the often-problematic min and max macros (and WIN32_LEAN_AND_MEAN trims rarely-used parts of the header).
  • Advanced macros involving stringization (i.e., macros that print debugging messages) or token-pasting.

You should avoid using #define for the following purposes. The reasons are many; see for instance this FAQ entry.

  • Compile-time constants. Use const instead.
  • Simple macro functions. Use inline functions and templates instead.