int a;
printf("%d\n", a);
I wonder if %d is a cast?
No, it is a conversion specifier inside printf()'s first argument, the format string. It tells printf() to print a decimal representation of the int you passed as the second argument.
It is not. It is just a "hint" telling the printf() function to treat the argument 'a' as an 'int'.
In any case it isn't a cast but a reinterpretation (like taking the address of a value, casting it to a pointer of a different type, and then reading the contents as the new type).
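A minimal sketch of that kind of reinterpretation (using memcpy() rather than dereferencing a cast pointer, since the pointer version violates strict aliasing; the exact output depends on your machine's endianness):

#include <stdio.h>
#include <string.h>

int main(void)
{
    double d = 1.5;
    int i;

    /* Reinterpretation: copy the first sizeof(int) bytes of d's
       object representation into an int. No value conversion happens. */
    memcpy(&i, &d, sizeof i);
    printf("%d\n", i);   /* prints part of 1.5's bit pattern, not 1 */
    return 0;
}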
Example:
printf("%d\n", 1.5);
won't print the integer 1, but an integer formed from the bytes of 1.5's IEEE 754 representation (strictly speaking the type mismatch is undefined behavior, so on some ABIs you may get unrelated garbage instead). If you want an actual cast, you must explicitly put (int) in front of the value.
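With the cast, a genuine value conversion takes place before the call:

printf("%d\n", (int)1.5);   /* prints 1 */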
No, it is a format specifier. It has semantic meaning only to the formatted I/O functions and is not part of the C language itself. You could equally well write your own function that interprets the same specifiers however it likes.
All it does is specify the 'human readable' representation in which to present an int value; there is no type conversion or translation.
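For instance, the same int can be presented in several forms without its stored value ever changing:

int a = 255;
printf("%d %x %o\n", a, a, a);   /* prints "255 ff 377": one value, three presentations */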
No, it's not a cast, but I suggest you take a look at the source for printf() to understand this. There's nothing special about printf() -- it's just a varargs function like any other. It's one of the first functions you learn in C, usually well before you learn varargs, so it often sticks out in people's minds as special when it's really not. A quick study of the source will probably be enlightening.
When you pass a format string to printf(), you're telling the function what to expect in its argument list (generally on the stack), but that might not agree with what you actually put there. With %d, you're telling printf() to take the next integer-sized chunk of bytes off the argument list and format those bytes as if they represent a signed decimal number. So when printf() parses the format string and encounters a %d, it will probably do something like:
int num = va_arg(args, int);
And then format and output the bytes in "num" as if they were a signed integer, regardless of what kind of argument you actually passed. If you put a floating-point value in the arguments where printf() is told to expect an integer (a float is promoted to double in a variadic call), the output will typically be a decimal rendering of some of the IEEE floating point bytes -- formally undefined behavior, probably not what you intended, and not what a true cast would have done.
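To make that concrete, here is a stripped-down sketch of a printf()-style varargs function (my_printf() is a hypothetical name, and it handles only %d and %s; a real printf() does far more, but the va_arg() mechanism is the same):

#include <stdarg.h>
#include <stdio.h>

/* Minimal printf()-like function: the format string tells us which
   type to pull off the variadic argument list with va_arg(). */
static void my_printf(const char *fmt, ...)
{
    va_list args;
    va_start(args, fmt);
    for (const char *p = fmt; *p != '\0'; p++) {
        if (*p == '%' && p[1] == 'd') {
            int num = va_arg(args, int);   /* trusts the caller */
            printf("%d", num);
            p++;
        } else if (*p == '%' && p[1] == 's') {
            const char *s = va_arg(args, const char *);
            fputs(s, stdout);
            p++;
        } else {
            putchar(*p);
        }
    }
    va_end(args);
}

int main(void)
{
    my_printf("%s has %d items\n", "list", 3);   /* prints "list has 3 items" */
    return 0;
}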