For example, when I'm dividing two ints and want a float returned, I superstitiously write something like this:
int a = 2, b = 3;
float c = (float)a / (float)b;
If I do not cast a and b to floats, it'll do integer division and return an int.
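To make this concrete, here is a small self-contained test program I can compile and run (standard C; the variable names no_cast, left_cast, and both_cast are just mine, for comparison):

#include <stdio.h>

int main(void) {
    int a = 2, b = 3;

    float no_cast   = a / b;                 /* no casts at all */
    float left_cast = (float)a / b;          /* cast only the left operand */
    float both_cast = (float)a / (float)b;   /* cast both operands */

    printf("%f %f %f\n", no_cast, left_cast, both_cast);
    return 0;
}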
Similarly, if I want to multiply a signed 8-bit number by an unsigned 8-bit number, I cast them both to signed 16-bit numbers before multiplying, for fear of overflow:
u8 a = 255;
s8 b = -127;
s16 c = (s16)a * (s16)b;
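For reference, here's a self-contained version of that snippet that actually compiles; I'm assuming u8, s8, and s16 are the usual fixed-width typedefs from <stdint.h>, and the no_cast variable is only there so the two forms can be compared:

#include <stdio.h>
#include <stdint.h>

typedef uint8_t u8;
typedef int8_t  s8;
typedef int16_t s16;

int main(void) {
    u8 a = 255;
    s8 b = -127;

    s16 both_cast = (s16)a * (s16)b;   /* cast both operands up front */
    s16 no_cast   = a * b;             /* same multiplication without casts */

    printf("%d %d\n", both_cast, no_cast);
    return 0;
}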
How exactly does the compiler behave in these situations when I don't cast at all, or when I cast only one of the operands? Do I really need to explicitly cast both, or is casting just the left one, or just the right one, enough?