I'm curious:
If you do a printf("%f", number);
what is the precision of the statement? I.e. How many decimal places will show up? Is this compiler dependent?
The ANSI C standard, in section 7.9.6.1, says this about the f format specifier:
If the precision is missing, it is taken as 6
The default precision for %f is 6 digits (see ISO C99 specification, 7.19.6.1/7).
The book C: A Reference Manual states that if no precision is specified, the default precision is 6 (i.e. 6 digits after the decimal point).
One caveat: if the number is infinite (e.g. 1.0/0.0) or NaN, then C99 specifies that the output is a string such as inf, -inf, infinity, or -infinity for infinities, and nan (possibly with an extra character sequence) for NaNs; the exact spelling is implementation-defined.