I have a long double constant that I am setting either as const or non-const. It is longer (40 digits) than the precision of a long double on my test workstation (19 digits).
When I print it, it is displayed with only 16 digits of precision rather than 19.
Here is the code I am testing:
#include <iostream>
#include <iomanip>
#include <limits>
#include <cstdio>

using namespace std;

int main()
{
    const long double constLog2 = 0.6931471805599453094172321214581765680755;
    long double log2 = 0.6931471805599453094172321214581765680755;

    cout << numeric_limits<long double>::digits10 + 1 << endl;
    cout << "const via cout: " << setprecision(19) << constLog2 << endl;
    cout << "non-const via cout: " << setprecision(19) << log2 << endl;
    printf("const via printf: %.19Lf\n", constLog2);
    printf("non-const via printf: %.19Lf\n", log2);
    return 0;
}
Output:
$ ./a.out
19
const via cout: 0.6931471805599452862
non-const via cout: 0.6931471805599452862
const via printf: 0.6931471805599452862
non-const via printf: 0.6931471805599452862
I would expect 0.6931471805599453094, but instead get 0.6931471805599452862.
Is there a reason that the 19 digits of precision are cut to 16 digits?
Here is my environment:
$ gcc --version
i686-apple-darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5490)
I am seeing the same problem with other versions of gcc, e.g.:
$ gcc --version
g++ (GCC) 3.4.6 20060404 (Red Hat 3.4.6-10)
I can look into NTL or other libraries, but I'm curious what is causing this. Thanks for your insight.