In Java I run:

System.out.println(Math.log(249.0/251.0));

Output: -0.008000042667076265

In C# I run:

Math.Log(x/y); // where x and y are almost assuredly 249.0 and 251.0 respectively

Output: -0.175281838 (printed out later in the program)

Google claims:

Log(249.0/251.0)

Output: -0.00347437439

And Mac OS claims about the same thing (the first difference between Google and Snow Leopard is at about 10^-8, which is negligible).

Is there any reason these results should vary so widely, or am I missing something very obvious? (I did check that Java and C# both use base e.) Even mildly different values of e wouldn't account for such a big difference. Any suggestions?

EDIT:

Verifying on Wolfram Alpha suggests that Java is right (or that Wolfram Alpha uses Java's Math for logarithms...) and that my C# program doesn't have the right input. I'm disinclined to believe this, though, because taking (e^(Google result) - 249/251) gives me an error of 0.0044, which is pretty big in my opinion and suggests there is a different problem at hand...
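The exponentiation check above can be sketched in Java (the constants are just the values quoted in this question; the second check, undoing the result in base 10 instead of base e, is an added assumption for comparison):

```java
// Sketch: re-running the e^(Google result) check, plus the same check in base 10.
public class CheckBase {
    public static void main(String[] args) {
        double googleResult = -0.00347437439;   // Google's output, quoted above
        double ratio = 249.0 / 251.0;

        // e^(Google result) - 249/251: the ~0.0044 error described above
        System.out.println(Math.exp(googleResult) - ratio);

        // 10^(Google result) - 249/251: essentially zero
        System.out.println(Math.pow(10.0, googleResult) - ratio);
    }
}
```

That the error vanishes under 10^x rather than e^x hints the discrepancy is one of base rather than of input.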

+7  A: 

You're looking at logarithms with different bases:

  • Java's System.out.println(Math.log(249.0/251.0)); is a natural log (base e)
  • C#'s Math.Log(x, y) gives the log of x with the base specified by y
  • Google's Log(249.0/251.0) gives the log base 10

Though I don't get the result you do from C# (Math.Log( 249.0, 251.0) == 0.998552147171426).
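The three behaviors listed above can be sketched side by side in Java (a minimal sketch: Math.log10 plays the role of Google's log, and the change-of-base quotient reproduces C#'s two-argument Math.Log(249.0, 251.0)):

```java
// Minimal sketch of the three different "log" meanings.
public class LogBases {
    public static void main(String[] args) {
        double r = 249.0 / 251.0;

        // Java's Math.log: natural log, base e
        System.out.println(Math.log(r));                       // ~ -0.008000042667076265

        // Google-style log: base 10
        System.out.println(Math.log10(r));                     // ~ -0.00347437439

        // C#'s Math.Log(249.0, 251.0) is log base 251 of 249;
        // the same value via change of base:
        System.out.println(Math.log(249.0) / Math.log(251.0)); // ~ 0.998552147171426
    }
}
```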

Michael Burr
My apologies: my C# code was miscopied (I had really complicated variable names and input translation on that line). It does partially answer my question, though: Google and Java use different bases. So the question is now Java vs. C#.
piggles
I get the same result in C# as in Java with `Math.Log(249.0/251.0)` - at least to the 16th decimal place.
Michael Burr
Hmmm, interesting. I will spend another five hours poring over the C# code, in that case. Would someone be able to assure me that, say, an AMD processor would return the same result as an Intel processor?
piggles
Instead of poring over the C# code, you might get your answer much more readily by stepping through and/or setting appropriate breakpoints in the debugger, to ensure the `x` and `y` values (as well as the result) are correct when passed to `Math.Log()`, and that the result isn't being modified at some point before being displayed.
Michael Burr
Unfortunately I can't seem to run .NET on my Mac, which makes me sad.
piggles
@Mechko, basically you can count on the same program generating the same output regardless of processor (unless you do exotic stuff with SSE3 etc). The "single step in debugger" advice is probably the best you can get right now.
Thorbjørn Ravn Andersen
@Thor Thanks for confirming the architecture stuff.
piggles
+3  A: 

You have a mistake somewhere in your C# program between where the log is calculated and where it is printed out. Math.Log gives the correct answer:

class P
{
  static void Main()
  {
      System.Console.WriteLine(System.Math.Log(249.0/251.0));
  }
}

prints out -0.00800004266707626

Eric Lippert