views:

135

answers:

5

What are the points that should be kept in mind while writing code that should be portable across both 32-bit and 64-bit machines?

Thinking more on this, I feel that if you can add your experience in terms of issues faced, that would help.

Adding further on this, I once faced a problem due to a missing prototype for a function which was returning a pointer. When I ported the same code to a 64-bit machine, it was crashing, and I had no clue about the reason for quite some time. Later I realised that functions with missing prototypes are assumed to return int, which was causing the problem.

Any such examples can help.

EDIT: Adding to community wiki.

+2  A: 

Gotchas:

  1. Casting pointers to integer types is dangerous
  2. Data structure sizes can change
  3. Watch out for sign extension
  4. The ABI may differ between the two targets

Some tips & tricks I've found helpful:

  1. Get yourself a native-size integer type (from a header or typedef your own) and use it when you have variables that don't care about size.
  2. Use explicitly sized variable types wherever possible (uint64_t, int32_t, etc.)
Carl Norum
+3  A: 
  • Some integral types may have different sizes
  • Pointers are of different lengths
  • Structure padding
  • Alignment

On Windows, there is only one calling convention on x64, as opposed to the multiple conventions on a regular x86 machine.

Things get murkier when you have some components which are 32-bit and some 64-bit. On Windows, I ended up writing a COM service to get them to talk.

dirkgently
+3  A: 

Pushing pointers onto the stack takes up twice as much space. Stack size may not change between OS versions, though, causing code that runs fine in 32-bit to mysteriously fail when compiled and run unchanged on 64-bit. Don't ask me how I know this.

No Refunds No Returns
I have never heard of this one. Does it really have an effect?
Jay
uh ... yeah ...
No Refunds No Returns
This is true only when the compiler can't or doesn't keep the pointers in registers.
philippe
+2  A: 

Write automated tests and run them regularly on both platforms.

Jesse Weigert
+3  A: 

sizeof(int) might != sizeof(void*)

Alignment: it's possible that alignment needs change. This can expose bugs where you have been mistreating things that should have been aligned but were only aligned by accident on 32-bit (or on a processor that doesn't care).

Don't pass 0 into varargs if the receiver is expecting a pointer. This is painful in C++, where savvy devs know that 0 is a valid null pointer. A C dev will usually use NULL, so you are probably OK.

pm100
+1 Now I am wondering if that behavior is compliant. The standard says NULL evaluates to 0. Shouldn't it be promoted to a (void*) by default? Not that you shouldn't use NULL, but reading the standard, header writers could just as well define NULL as 0. http://www.open-std.org/JTC1/SC22/wg14/www/docs/n1425.pdf See point 3 in 6.3.2.3. It sounds like GCC is wrong, or will be wrong once that becomes the current standard.
jbcreix
gcc converted NULL to nullptr in my case, so it worked. I agree that if NULL were defined as 0 then it would not have fixed my bug; in that case I would have to explicitly write (void*)0.
pm100