In Java I run:
System.out.println(Math.log(249.0/251.0));
Output: -0.008000042667076265
In C# I run:
Math.Log(x / y); // where x and y are almost assuredly 249.0 and 251.0, respectively
Output: -0.175281838 (printed out later in the program)
Google claims:
Log(249.0/251.0)
Output: -0.00347437439
And Mac OS X claims about the same thing (the first difference between Google and Snow Leopard is at about 10^-8, which is negligible).
Is there any reason these results should vary so widely, or am I missing something very obvious? (I did check that Java and C# both use base e.) Even mildly different values of e wouldn't account for such a big difference. Any suggestions?
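For what it's worth, here is a minimal Java sketch of the kind of base check I have in mind (per the Java docs, Math.log is the natural, base-e logarithm and Math.log10 is base 10):

public class LogBaseCheck {
    public static void main(String[] args) {
        double q = 249.0 / 251.0;
        // Natural (base-e) logarithm, as in the snippet at the top
        System.out.println("ln(249/251)    = " + Math.log(q));
        // Base-10 logarithm, the other base a calculator's "log" might mean
        System.out.println("log10(249/251) = " + Math.log10(q));
    }
}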
EDIT:
Verifying on Wolfram Alpha seems to suggest that Java is right (or that Wolfram Alpha uses Java's Math for logarithms...) and that my C# program doesn't have the right input. I'm disinclined to believe this, though, because taking (e^(Google result) - 249/251) gives me an error of 0.0044, which is pretty big in my opinion and suggests there is a different problem at hand...
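To be concrete, this is the back-check I mean, as a small Java sketch using the numbers quoted above:

public class ExpBackCheck {
    public static void main(String[] args) {
        double q = 249.0 / 251.0;
        double googleResult = -0.00347437439;          // Google's output above
        double javaResult = -0.008000042667076265;     // Java's output above
        // If a value really is ln(249/251), exponentiating it should recover 249/251.
        System.out.println("e^(Google result) - 249/251 = " + (Math.exp(googleResult) - q));
        System.out.println("e^(Java result)   - 249/251 = " + (Math.exp(javaResult) - q));
    }
}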