I ran the following code and got some strange output.
#include <stdio.h>

int
mean_ansi (int num1, int num2)
{
  printf ("In %s\n", __FUNCTION__);
  printf ("num1,num2 is %d,%d\n", num1, num2);
  return (num1 + num2) / 2;
}

int
mean_K_and_R (num1, num2)
     int num1, num2;
{
  printf ("In %s\n", __FUNCTION__);
  printf ("num1,num2 is %d,%d\n", num1, num2);
  return (num1 + num2) / 2;
}

int
main (void)
{
  int i = 6;
  double f = 1.0;
  printf ("In %s\n", __FUNCTION__);
  printf ("[f,i] = [%f,%d]\n", f, i);
  /* deliberate mistakes: passing a double where an int is expected */
  mean_ansi (f, i);
  mean_K_and_R (f, i);
  return 0;
}
Output:
In main
[f,i] = [1.000000,6]
In mean_ansi
num1,num2 is 1,6
In mean_K_and_R
num1,num2 is 0,1072693248
Can anyone explain this behavior?
I looked at the assembly but could not make much of it.
Is there a difference in the way function arguments are passed on the stack between these two declaration styles?
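My current guess is that in the K&R case the raw bit pattern of the double 1.0 is being read back as two ints. Here is a minimal sketch I put together to test that idea (assuming a little-endian machine with 32-bit int and 64-bit IEEE 754 double); it just copies the bytes of 1.0 into two ints with memcpy:

#include <stdio.h>
#include <string.h>

int
main (void)
{
  double f = 1.0;
  int parts[2];

  /* Copy the 8 bytes of the double into two 4-byte ints. */
  memcpy (parts, &f, sizeof f);

  /* On a little-endian machine this prints 0 and 1072693248 (0x3FF00000),
     i.e. the low and high halves of the IEEE 754 encoding of 1.0. */
  printf ("%d %d\n", parts[0], parts[1]);
  return 0;
}

Those are exactly the two values mean_K_and_R printed, which makes me suspect the K&R definition just reinterprets the bytes it receives, while the prototype for mean_ansi makes the compiler convert the double to int at the call site. Is that what is actually happening?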