Try 285212672ULL; if you write it without a suffix, the compiler treats it as a plain int. The reason it works when you go through a variable is that the integer is converted to unsigned long long in the assignment, so the value passed to printf() has the right type.
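If you want to see the three cases side by side, here is a minimal sketch (assuming, as above, a platform with a 32-bit int and a 64-bit unsigned long long; the variable name big is just for illustration):

#include <stdio.h>

int main(void) {
    unsigned long long big = 285212672;   /* plain int literal, converted by the assignment */
    printf("%llu\n", big);                /* fine: the argument really is unsigned long long */
    printf("%llu\n", 285212672ULL);       /* fine: the suffix makes the literal unsigned long long */
    printf("%llu\n", 285212672);          /* undefined behaviour: a plain int passed for %llu */
    return 0;
}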
And before you ask, no, the compiler probably isn't smart enough to figure it out from the "%llu" in the printf() format string. That's a different level of abstraction. The compiler is responsible for the language syntax; printf() semantics are not part of that syntax, since it's a runtime library function (no different really from your own functions, except that it's included in the standard library).
Consider the following code on a system with a 32-bit int and a 64-bit unsigned long long:
#include <stdio.h>

int main(void) {
    printf("%llu\n", 1, 2);      /* mismatched: 1 and 2 are plain 32-bit ints */
    printf("%llu\n", 1ULL, 2);   /* 1ULL matches %llu; the 2 is superfluous */
    return 0;
}
which outputs:
8589934593
1
In the first case, the two 32-bit integers 1 and 2 are pushed on the stack and printf() interprets them as a single 64-bit unsigned long long value, 2 x 2^32 + 1 = 8589934593. The 2 argument is inadvertently folded into that value. In the second case, you actually push the 64-bit value 1 plus a superfluous 32-bit integer 2, which is ignored.
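You can check that arithmetic with a small sketch that builds the same 64-bit pattern explicitly (this just verifies the number, assuming the stack layout described above):

#include <stdio.h>

int main(void) {
    /* high 32 bits hold the 2, low 32 bits hold the 1: 2 x 2^32 + 1 */
    unsigned long long combined = (2ULL << 32) | 1ULL;
    printf("%llu\n", combined);   /* prints 8589934593, matching the first output above */
    return 0;
}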
Note that this "getting out of step" between your format string and your actual arguments is a bad idea. Something like:
printf ("%llu %s %d\n", 0, "hello", 0);
is likely to crash because the 32-bit "hello"
pointer will be consumed by the %llu
and %s
will try to de-reference the final 0
argument. The following "picture" illustrates this (let's assume that cells are 32-bits and that the "hello" string is stored at 0xbf000000.
What you pass   Stack frames       What printf() uses

                +------------+
      0         |      0     |  \
                +------------+   > 64-bit value for %llu.
   "hello"      | 0xbf000000 |  /
                +------------+
      0         |      0     |     value for %s (likely core dump here).
                +------------+
                |      ?     |     value for %d (could be anything).
                +------------+
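For completeness, the safe version of that call simply makes every argument match its conversion specifier, for example:

printf ("%llu %s %d\n", 0ULL, "hello", 0);

Now %llu consumes the 64-bit 0ULL, %s gets the "hello" pointer, and %d gets the final 0.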