Inspired by this topic, I decided to write a simple program that does exactly that. The logic isn't complicated, and the program works about 75% of the time. The number of values to read is defined as #define BUFSIZE x, where x can be an arbitrary int.
The problem arises when ((BUFSIZE+1) % sizeof(int)) == 0. For example, with BUFSIZE=10 my program behaves correctly, but with BUFSIZE=11 I get odd behaviour.
Here is the source code:
#include <stdio.h>
#include <stdlib.h>

#define BUFSIZE 7

int max(int *buf);

int main()
{
    int bufsize = BUFSIZE, *buf = malloc(sizeof(int[bufsize]));

    // read values
    int *ptr = buf;
    while(--bufsize + 1)
    {
        printf("Input %d: ", BUFSIZE - bufsize);
        scanf("%d", ptr);
        ++ptr;
    }

    // reset pointer and determine max
    ptr = buf;
    printf("\nMax: %d\n", max(ptr));

    // cleanup
    free(buf);
    ptr = NULL;
    buf = NULL;
    exit(EXIT_SUCCESS);
}

int max(int *buf)
{
    int max = 0;
    while(*buf)
    {
        printf("%d\n", *buf);
        if(*buf > max) max = *buf;
        ++buf;
    }
    return max;
}
And here is some sample output for BUFSIZE=2 (correct) and BUFSIZE=3 (incorrect):
suze:/home/born05/htdocs/experiments/c# gcc input.c && ./a.out
Input 1: 12
Input 2: 23
12
23
Max: 23
suze:/home/born05/htdocs/experiments/c# gcc input.c && ./a.out
Input 1: 12
Input 2: 23
Input 3: 34
12
23
34
135153
Max: 135153
I have a feeling it is something very simple, but I can't put my finger on the exact cause of this misbehaviour. Could someone point out the (perhaps obvious) flaw to me?