I'm wondering about the practical use of #undef in C. I'm working through K&R, and am up to the preprocessor. Most of this was material I (more or less) understood, but something on page 90 (second edition) stuck out at me:

Names may be undefined with #undef, usually to ensure that a routine is really a function, not a macro:

#undef getchar

int getchar(void) { ... }

Is this a common practice to defend against someone #define-ing a macro with the same name as your function? Or is this really more of a sample that wouldn't occur in reality? (e.g., no one in his right, wrong, or insane mind should be rewriting getchar(), so it shouldn't come up.) With your own function names, do you feel the need to do this? Does that change if you're developing a library for others to use?

+2  A: 

I only use it when a macro in an #included file is interfering with one of my functions (e.g., it has the same name). Then I #undef the macro so I can use my own function.
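For instance (the header and the min macro here are hypothetical, standing in for whatever third party defined the clashing name):

```c
#include <assert.h>

/* Pretend this line came from a third-party header you #included: */
#define min(a, b) ((a) < (b) ? (a) : (b))

/* The function-like macro would mangle the definition below
   (min(int a, int b) expands to nonsense), so reclaim the name: */
#undef min

int min(int a, int b)
{
    return a < b ? a : b;
}
```

After the #undef, min(2, 3) is an ordinary function call rather than a macro expansion.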

mipadi
that's not a maintainable solution. most modern code metrics (readability, simplicity, maintainability) would conclude it's a terrible solution.
Dustin Getz
+1  A: 

Is this a common practice to defend against someone #define-ing a macro with the same name as your function? Or is this really more of a sample that wouldn't occur in reality? (e.g., no one in his right, wrong, or insane mind should be rewriting getchar(), so it shouldn't come up.)

A little of both. Good code will not require use of #undef, but there's lots of bad code out there you have to work with. #undef can prove invaluable when somebody pulls a trick like #define bool int.
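A sketch of that rescue; the offending #define is inlined here to stand in for a hypothetical badly behaved legacy header:

```c
#include <assert.h>

/* What a badly behaved legacy header might do: */
#define bool int
#define true 1
#define false 0

/* Undo the damage before pulling in the real definitions: */
#undef bool
#undef true
#undef false
#include <stdbool.h>

bool is_even(int n)
{
    return n % 2 == 0;
}
```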

John Millikin
Your definition of "good code" might not coincide with the next developer's. There are (a handful of) legitimate uses of macro trickery.
Dustin Getz
@Dustin: almost every time I've used #undef, it's because somebody didn't properly namespace their macro.
John Millikin
+4  A: 

Because preprocessor #defines all live in one global namespace, it's easy for namespace conflicts to result, especially when using third-party libraries. For example, if you wanted to create a function named OpenFile, it might not compile correctly, because the header file <windows.h> defines the token OpenFile to map to either OpenFileA or OpenFileW (depending on whether UNICODE is defined). The correct solution is to #undef OpenFile before defining your function.
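On a non-Windows system you can simulate the clash; the #define below stands in for what <windows.h> effectively does (the real header maps OpenFile to OpenFileA or OpenFileW):

```c
#include <assert.h>
#include <stdio.h>

/* Stand-in for the macro <windows.h> would define: */
#define OpenFile OpenFileA

/* Without this #undef, the function below would silently be
   compiled under the name OpenFileA: */
#undef OpenFile

/* Returns 0 if the file can be opened for reading, -1 otherwise. */
int OpenFile(const char *path)
{
    FILE *fp = fopen(path, "rb");
    if (fp == NULL)
        return -1;
    fclose(fp);
    return 0;
}
```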

Adam Rosenfield
The correct solution here is to not clobber Windows functionality in a Windows environment.
Dustin Getz
+1  A: 

In addition to fixing problems with macros polluting the global namespace, another use of #undef is the situation where a macro might be required to have different behavior in different places. This is not a really common scenario, but a couple of cases that come to mind are:

  • the assert macro can have its definition changed in the middle of a compilation unit, for the case where you want assertion checks in some portions of your code but not others. To reconfigure the desired behavior of assert, redefine (or #undef) the NDEBUG macro and include <assert.h> again; each inclusion of that header redefines assert according to the current state of NDEBUG

  • I've seen a technique used to ensure that globals are defined exactly once by using a macro to declare the variables as extern, but the macro would be redefined to nothing for the single case where the header/declarations are used to define the variables.

Something like (I'm not saying this is necessarily a good technique, just one I've seen in the wild):

/* globals.h */
/* ------------------------------------------------------ */
#undef GLOBAL
#ifdef DEFINE_GLOBALS
#define GLOBAL
#else
#define GLOBAL extern
#endif

GLOBAL int g_x;
GLOBAL char* g_name;
/* ------------------------------------------------------ */



/* globals.c */
/* ------------------------------------------------------ */
#include "some_master_header_that_happens_to_include_globals.h"

/* define the globals here (and only here) using globals.h */
#define DEFINE_GLOBALS
#include "globals.h"

/* ------------------------------------------------------ */
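The assert case in the first bullet could look like this (a sketch; <assert.h> is special in that it may be included multiple times, and each inclusion redefines assert based on the current state of NDEBUG):

```c
#include <assert.h>   /* first inclusion: NDEBUG not defined, assert is live */

int checked_stage(int x)
{
    assert(x >= 0);   /* aborts the program on negative input */
    return x * 2;
}

/* Switch assertions off for a hot section of the same file: */
#define NDEBUG
#include <assert.h>   /* re-inclusion redefines assert to a no-op */

int hot_stage(int x)
{
    assert(x >= 0);   /* now compiled out: no check, no abort */
    return x * 2;
}
```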
Michael Burr
I've seen it in the wild. I used to use it - 20 years ago. I gave up on it - 15 years ago. (All times approximate!)
Jonathan Leffler
I might add - some of the code I currently look after does it. I haven't worked up the energy to fix it - more pressing issues than that, as ever.
Jonathan Leffler
+5  A: 

Macros are often used to generate bulk code. Such usage is often quite localized, and it's safe to #undef any helper macros at the end of the particular header to avoid name clashes: that way only the actual generated code gets imported elsewhere, and the macros used to generate it don't.

/Edit: As an example, I've used this to generate structs for me. The following is an excerpt from an actual project:

#define SEQAN_MAKE_PC_PROVIDER(name) \
    struct PcApi##name { \
        some members …
    };

SEQAN_MAKE_PC_PROVIDER(SA)
SEQAN_MAKE_PC_PROVIDER(SSA)
SEQAN_MAKE_PC_PROVIDER(AF)

#undef SEQAN_MAKE_PC_PROVIDER
Konrad Rudolph
+6  A: 

If you read Plauger's "The Standard C Library", you will see that the <stdio.h> header is expected to provide getchar() as a macro, and getc() too (with special permission to evaluate the file pointer argument more than once!). However, the implementation is also obliged to provide an actual function that does the same job, primarily so that you can take a function pointer to getchar() or getc() and pass it to other functions.

That is, by doing:

#include <stdio.h>
#undef getchar

extern int some_function(int (*)(void));

int core_function(void)
{
   int c = some_function(getchar);
   return(c);
}

As written, the core_function() is pretty meaningless, but it illustrates the point. You can do the same thing with the isxxxx() macros in <ctype.h> too, for example.

Normally, you don't want to do that - remove the macro definition. But, when you need the real function, you can get hold of it. People who provide libraries can emulate the functionality of the standard C library to good effect.
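The same trick with a <ctype.h> classifier, sketched below; count_if is a made-up helper, not a standard function:

```c
#include <assert.h>
#include <ctype.h>

#undef isdigit   /* guarantee we name the real function, not the macro */

/* Count the characters in s for which pred returns nonzero. */
static int count_if(const char *s, int (*pred)(int))
{
    int n = 0;
    for (; *s != '\0'; s++)
        if (pred((unsigned char)*s))
            n++;
    return n;
}
```

For example, count_if("a1b2c3", isdigit) returns 3.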

Jonathan Leffler
A curious bit I just stumbled on: the C standard explicitly permits using `#undef` to gain access to the actual function. However, the C++ standard has a footnote in 17.4.3.1.1 [lib.macro.names] that says, "It is not permissible to remove a library macro definition by using the #undef directive".
Michael Burr
@Mike B: I wonder how many people that catches out? Probably not many, not even hardened C programmers who are partially migrated to C++. I have never had cause to use the facility in C, let alone in C++.
Jonathan Leffler
A: 

If a macro can be #define'd, there must be a facility to #undef it.

A memory tracker I use defines its own new/delete macros to track file/line information. These macros break the SC++L:

#pragma push_macro( "new" )
#undef new
#include <vector>
#pragma pop_macro( "new" )

Regarding your more specific question: namespaces are often emulated in C by prefixing library functions with an identifier.

Blindly undefing macros is going to add confusion, reduce maintainability, and may break things that rely on the original behavior. If you were forced, at least use push/pop to preserve the original behavior everywhere else.

Dustin Getz
Which language or compiler supports #pragma push_macro/pop_macro? It is highly non-standard. And I assume SC++L is the "Standard C++ Library"?
Jonathan Leffler
+1  A: 

I think Jonathan Leffler gave you the right answer, but here is a very rare case where I use an #undef. Normally a macro should be reusable inside many functions; that's why you define it at the top of a file or in a header file. But sometimes you have some repetitive code inside a function that can be shortened with a macro.


int foo(int x, int y)
{
#define OUT_OF_RANGE(v, vlower, vupper) \
    if (v < vlower) {v = vlower; goto EXIT;} \
    else if (v > vupper) {v = vupper; goto EXIT;}

    /* do some calcs */
    x += (x + y)/2;
    OUT_OF_RANGE(x, 0, 100);
    y += (x - y)/2;
    OUT_OF_RANGE(y, -10, 50);

    /* do some more calcs and range checks*/
    ...

EXIT:
    /* undefine OUT_OF_RANGE, because we don't need it anymore */
#undef OUT_OF_RANGE
    ...
    return x;
}

To show the reader that this macro is only useful inside the function, it is undefined at the end. I don't want to encourage anyone to use such hackish macros. But if you have to, #undef them at the end.

quinmars
bool clip(v, lower, upper). Look ma, no hacks!
Dustin Getz
I'm sure that function doesn't jump to EXIT :). But seriously, I know it's not a perfect example, and that's because in most cases an (inline) function does the job just as well.
quinmars