I have a piece of code that behaves differently on Mac OS X and Linux (Ubuntu, Fedora, ...). It concerns type casting in arithmetic expressions inside printf statements. The code is compiled with gcc/g++.
The following program
#include <stdio.h>

int main () {
    /* 153 * 86400 fits in an int; the float cast happens before the division by the double 86400.0 */
    float days = (float) (153 * 86400) / 86400.0;
    printf ("%f\n", days);

    /* store the quotient in a float first, then truncate it */
    float foo = days / 30.6;
    printf ("%d\n", (int) foo);

    /* truncate the quotient directly */
    printf ("%d\n", (int) (days / 30.6));
    return 0;
}
generates on Linux
153.000000
5
4
and on Mac OS X
153.000000
5
5
Why?
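I assume the difference comes down to how the quotient days / 30.6 is rounded before it is truncated. A small check like this (just my attempt at a diagnostic, not part of the original code) should print the intermediate values with full precision:

#include <stdio.h>

int main () {
    float days = (float) (153 * 86400) / 86400.0;
    double q  = days / 30.6;    /* quotient kept as a double */
    float  qf = days / 30.6;    /* quotient rounded to float before truncation */
    printf ("%.17g\n", q);
    printf ("%.17g\n", (double) qf);
    printf ("%d %d\n", (int) q, (int) qf);
    return 0;
}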
To my surprise, the following produces the same result on both Mac OS X and Linux:
printf ("%d\n", (int) (((float)(153 * 86400) / 86400.0) / 30.6));
printf ("%d\n", (int) (153 / 30.6));
printf ("%.16f\n", (153 / 30.6));
Why? I don't have a clue. Thanks.