Linux's stddef.h defines offsetof() as:

#define offsetof(TYPE, MEMBER) ((size_t) &((TYPE *)0)->MEMBER)

whereas the Wikipedia article on offsetof() (http://en.wikipedia.org/wiki/Offsetof) defines it as:

#define offsetof(st, m) \
    ((size_t) ( (char *)&((st *)(0))->m - (char *)0 ))

Why subtract (char *)0 in the Wikipedia version? Is there any case where that would actually make a difference?
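For concreteness, here's a small test program I'd use to compare them (the struct and the offsetof_linux / offsetof_wiki macro names are just made up for the test; both hand-rolled forms are formally undefined behavior, so I'm also printing the standard offsetof() from <stddef.h> for reference):

#include <stdio.h>
#include <stddef.h>

#define offsetof_linux(TYPE, MEMBER) ((size_t) &((TYPE *)0)->MEMBER)
#define offsetof_wiki(st, m) \
    ((size_t) ( (char *)&((st *)(0))->m - (char *)0 ))

struct demo {
    char   a;
    int    b;
    double c;
};

int main(void)
{
    /* On common compilers all three lines print the same offset. */
    printf("stddef.h offsetof: %zu\n", offsetof(struct demo, c));
    printf("Linux form:        %zu\n", offsetof_linux(struct demo, c));
    printf("Wikipedia form:    %zu\n", offsetof_wiki(struct demo, c));
    return 0;
}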

+4  A: 

The standard does not require the null pointer to be represented by the bit pattern 0; it can be a platform-specific value.

Doing the subtraction guarantees that the null pointer's representation cancels out, so the result converts to the member's offset even on a platform where null is not the integer 0.
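A rough sketch of that claim, simulating pointer values with plain integers (the 0xFFFF0000 "null" bit pattern is entirely hypothetical):

#include <stdio.h>

int main(void)
{
    /* Hypothetical platform whose null pointer is not all-bits-zero. */
    unsigned long null_rep = 0xFFFF0000UL; /* imagined null-pointer bit pattern */
    unsigned long offset   = 8;            /* member's offset within the struct */

    /* Cast form: convert the pointer value (null + offset) straight to an
       integer; the null representation leaks into the result.            */
    unsigned long cast_form = null_rep + offset;

    /* Subtraction form: take the difference from (char *)0, so the null
       representation cancels and only the offset remains.               */
    unsigned long diff_form = (null_rep + offset) - null_rep;

    printf("cast form: %lu, subtraction form: %lu\n", cast_form, diff_form);
    return 0;
}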

R Samuel Klatchko
That's not the reason; any reasonable constant other than `0` could also be used. See my answer below.
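Purely to illustrate, a variant anchored at an arbitrary nonzero constant (offsetof_from is an invented name, and this is just as formally undefined as the 0-based forms):

#include <stdio.h>
#include <stddef.h>

/* Same subtraction trick, anchored at an arbitrary constant address. */
#define offsetof_from(st, m, base) \
    ((size_t)((char *)&((st *)(base))->m - (char *)((st *)(base))))

struct demo { char a; long b; };

int main(void)
{
    printf("%zu\n", offsetof_from(struct demo, b, 1024)); /* anchor at 1024 */
    printf("%zu\n", offsetof_from(struct demo, b, 0));    /* anchor at 0    */
    return 0;
}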
Loadmaster
+5  A: 

The first version converts a pointer into an integer with a cast, which is not portable.

The second version is portable to a wider variety of compilers, because it relies on pointer arithmetic to produce an integer result rather than on a pointer-to-integer cast.
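If you want to sidestep the null pointer entirely, one sketch (the offsetof_obj macro name and the static instance are just illustrative, and this is not how the standard macro is specified) is to do the same pointer arithmetic on a real object:

#include <stdio.h>
#include <stddef.h>

struct demo {
    char   a;
    double b;
};

/* A file-scope instance, so the pointer arithmetic is done on a valid object. */
static struct demo demo_instance;

#define offsetof_obj(obj, m) \
    ((size_t)((char *)&(obj).m - (char *)&(obj)))

int main(void)
{
    printf("offsetof_obj: %zu\n", offsetof_obj(demo_instance, b));
    printf("offsetof:     %zu\n", offsetof(struct demo, b));
    return 0;
}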

BTW, I was the editor who added the original code to the Wikipedia entry, which was the Linux form. Later editors changed it to the more portable version.

Loadmaster
Thanks for the response. Followup question "Does (size_t)((char *)0) ever not evaluate to 0?" posted at http://stackoverflow.com/questions/2581522/does-size-tchar-0-ever-not-evaluate-to-0
Bruce Christensen
Interesting - but is the second expression really any more portable, since pointer arithmetic on null pointers is undefined?
Michael Burr
It's more portable in the sense that it works on more CPUs. Being undefined (by the standard) does not mean it can't work for some specific architecture(s).
Loadmaster