views: 408
answers: 6

I wonder if typedef and #define are the same in C?

+16  A: 

No.

#define is a preprocessor token: the compiler itself will never see it.
typedef is a compiler token: the preprocessor does not care about it.

You can use one or the other to achieve the same effect, but it's better to use the proper one for your needs:

#define MY_TYPE int
typedef int My_Type;

When things get "hairy", using the proper tool makes it right:

#define FX_TYPE void (*)(int)
typedef void (*stdfx)(int);

void fx_typ(stdfx fx); /* ok */
void fx_def(FX_TYPE fx); /* error */
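(After preprocessing, the second declaration becomes void fx_def(void (*)(int) fx);, which is not valid C.) For illustration, a small self-contained sketch of the typedef version in use; the show function and the call in main are just examples, not part of the original answer:

#include <stdio.h>

typedef void (*stdfx)(int);        /* stdfx: pointer to a function taking an int, returning void */

static void show(int x) { printf("%d\n", x); }   /* illustrative callback */

void fx_typ(stdfx fx) { fx(42); }  /* parameter declared via the typedef: compiles fine */

int main(void)
{
    fx_typ(show);                  /* show decays to a matching function pointer */
    return 0;
}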
pmg
+1  A: 

AFAIK, No.

'typedef' helps you set up an "alias" to an existing data type, e.g. typedef char chr;

#define is a preprocessor directive used to define macros or general pattern substitutions, e.g. #define MAX 100 substitutes all occurrences of MAX with 100.
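For illustration, a minimal sketch of both forms side by side (the buffer variable is just an example):

#define MAX 100      /* the preprocessor replaces every later MAX with 100 */
typedef char chr;    /* the compiler now treats chr as another name for char */

chr buffer[MAX];     /* what the compiler actually sees: char buffer[100]; */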

Amit
+1  A: 

No.
typedef is a C keyword that creates an alias for a type.
#define is a preprocessor directive that performs a text replacement before compilation. By the time the compiler gets to the code, the original "#defined" word is no longer there. #define is mostly used for macros and global constants.
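A short sketch of those two typical uses (the names here are just examples):

typedef unsigned long ticks_t;   /* type alias, handled by the compiler */
#define BUFFER_SIZE 512          /* global constant, replaced textually before compilation */

ticks_t elapsed = 0;
char buf[BUFFER_SIZE];           /* after preprocessing: char buf[512]; */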

Traveling Tech Guy
The usage of the term "pointer" could lead to some confusion here.
Amit
Agreed. That's why I went back and added a link to typedef on MSDN - just in case anyone in the future uses this question to find out what typedef is. But maybe I should change that word...
Traveling Tech Guy
+18  A: 

typedef obeys scoping rules just like variables, whereas a #define stays valid until the end of the file (or until a matching #undef).
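For instance (a minimal sketch of the scoping point; the function name is just an example):

void f(void)
{
    typedef int len_t;   /* visible only inside f */
    len_t n = 0;         /* ok here */
}
/* len_t is unknown past this point, whereas a #define made inside f
   would still be in effect for the rest of the file unless #undef'd */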

Also, some things can be done with typedef that cannot be done with #define.

Examples:

typedef int* int_p1;
int_p1 a, b, c;  // a, b, and c are all int pointers.

#define int_p2 int*
int_p2 a, b, c;  // only the first is a pointer!


typedef int a10[10];
a10 a, b, c; // create three 10-int arrays


typedef int (*func_p) (int);
func_p fp;  // fp is a pointer to a function that
            // takes an int and returns an int
Andreas Grech
Very good examples.
Pascal Cuoq
In the #define example, only the first variable is a pointer because of the way #define works. It's just a "dumb" preprocessor that substitutes one thing for another. In that example, `int_p2` is just replaced with `int*`, producing: `int* a, b, c`, which should be read as `int *a, b, c`.
RFelix
@RFelix, thanks for the explaining what I left out!
Andreas Grech
A: 

No, they are not the same. For example:

#define INTPTR int*
...
INTPTR a, b;

After preprocessing, that line expands to

int* a, b;

Hopefully you see the problem; only a will have the type int *; b will be declared a plain int (because the * is associated with the declarator, not the type specifier).

Contrast that with

typedef int *INTPTR;
...
INTPTR a, b;

In this case, both a and b will have type int *.

There are whole classes of typedefs that cannot be emulated with a preprocessor macro, such as pointers to functions or arrays:

typedef int (*CALLBACK)(void);
typedef int *(*(*OBNOXIOUSFUNC)(void))[20]; 
...
CALLBACK aCallbackFunc;        // aCallbackFunc is a pointer to a function 
                               // returning int
OBNOXIOUSFUNC anObnoxiousFunc; // anObnoxiousFunc is a pointer to a function
                               // returning a pointer to a 20-element array
                               // of pointers to int

Try doing that with a preprocessor macro.
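For completeness, a small usage sketch of the CALLBACK typedef (the answer function below is just an example, not from the original):

#include <stdio.h>

typedef int (*CALLBACK)(void);

static int answer(void) { return 42; }   /* illustrative callback */

int main(void)
{
    CALLBACK cb = answer;                /* cb points to answer */
    printf("%d\n", cb());                /* prints 42 */
    return 0;
}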

John Bode
A: 

They are very different, although they are often used to implement custom data types (which is what I am assuming this question is all about).

As pmg mentioned, #define is handled by the pre-processor (like a cut-and-paste operation) before the compiler sees the code, and typedef is interpreted by the compiler.

One of the main differences (at least when it comes to defining data types) is that the typedef name is actually visible to the compiler, while a #define name disappears before compilation even begins. For example,

#define defType int
typedef int tdType;

defType x;
tdType y;

Here, the compiler sees variable x simply as an int, because defType was replaced by the preprocessor before compilation started. Variable y, on the other hand, is recorded as having the type name tdType. Note that in C a typedef creates an alias rather than a distinct type, so tdType and int remain interchangeable: a function that takes a tdType parameter will happily accept a plain int. The benefit is in readability and in how the type is reported by the compiler and other tools, not in enforced type checking.
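A minimal sketch of that point (the takes_td function is just an example): the compiler accepts either spelling, because tdType is only an alias for int.

typedef int tdType;

static void takes_td(tdType v) { (void)v; }

int main(void)
{
    int    plain   = 7;
    tdType aliased = 8;
    takes_td(plain);     /* accepted: tdType and int are the same type */
    takes_td(aliased);   /* accepted as well */
    return 0;
}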

Also, some debuggers have the ability to handle typedefs, which can be much more useful than having all custom types listed as their underlying primitive types (as it would be if #define was used instead).

bta