Double dblValue = 0.0001;
Boolean a = (dblValue >= (1 / 1000)); // evaluates to true in C#
Boolean b = (dblValue >= 0.001);      // evaluates to false
Console.WriteLine("dblValue >= (1 / 1000) is " + a);
Console.WriteLine("dblValue >= 0.001 is " + b);
Console.ReadLine();

The above C# code evaluates 'a' to true and 'b' to false. In VB.NET, the equivalent code evaluates 'a' to false and 'b' to false. Why would 'a' evaluate to true?

Is there an implicit conversion I'm missing here - and why doesn't it affect VB.NET (Strict)?

+14  A: 

The expression 1 / 1000 is evaluated (at compile time in this case, although it's irrelevant really) using integer arithmetic in C#, so it evaluates to 0. Use 1.0 / 1000¹ instead to force double arithmetic to be used.

I believe VB always uses floating point arithmetic for /, and you have to use \ if you want to perform division using integer arithmetic, which is why you're seeing different behaviour there.


¹ Or, as per comments, use 1d or (double) 1 or anything else that forces either operand to be of type double.
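
For illustration, here is a minimal, self-contained C# sketch of the difference (the class name is just for this demo):

using System;

class DivisionDemo
{
    static void Main()
    {
        // Both operands are int literals, so / performs integer division:
        Console.WriteLine(1 / 1000);           // 0
        // Making either operand a double forces floating-point division:
        Console.WriteLine(1.0 / 1000);         // 0.001
        Console.WriteLine(1d / 1000);          // 0.001
        Console.WriteLine(1 / (double) 1000);  // 0.001
    }
}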

Jon Skeet
Or append a `d` to either argument to explicitly make it a double.
Joey
Shiny. Thanks :)
Rushyo
+2  A: 

Because 1/1000 is an integer expression, yielding 0.
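
A quick way to confirm this (assuming a Main method or the C# interactive prompt):

Console.WriteLine((1 / 1000).GetType()); // System.Int32
Console.WriteLine(1 / 1000);             // 0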

Henk Holterman
+5  A: 

1 and 1000 are both integers, so the result will be an integer (0 in this case). You need to force the use of doubles to complete the math.

Boolean b = (dblValue >= ((double) 1 / (double) 1000));

or

Boolean b = (dblValue >= (1d / 1000d));

Either will give you the result you're expecting.

Justin Niessner