So I have this code:

p.Value = 1;
decimal avg = p.Value * 100 / 10000;
string prntout = p.Key + " : " + avg.ToString();
Console.WriteLine(prntout);

But the program prints 0 instead of 0.01. p.Value is an int. How do I fix that?

+3  A: 

Change one of the literals into a decimal:

decimal avg = p.Value * 100m / 10000;

Now, to explain why this works:

Let's process the original line one operation at a time, substituting 1 for p.Value:

decimal avg = 1 * 100 / 10000; // int multiplication
decimal avg = 100 / 10000; // int division, remainder tossed out
decimal avg = (decimal) 0; // implicit conversion from int to decimal

By changing 100 to 100m, it's now:

decimal avg = 1 * 100m / 10000; // decimal multiplication
decimal avg = 100m / 10000; // decimal division
decimal avg = 0.01m;
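
For reference, here is a minimal, self-contained sketch of both evaluations side by side (using a plain int named value as a stand-in for p.Value):

using System;

class AvgDemo
{
    static void Main()
    {
        int value = 1; // stand-in for p.Value, which the question says is an int

        // All operands are int: 1 * 100 = 100, then 100 / 10000 = 0 (integer division),
        // and only the final 0 is converted to decimal.
        decimal intAvg = value * 100 / 10000;

        // The m suffix makes 100m a decimal, so both operations use decimal arithmetic
        // and the fraction survives.
        decimal decAvg = value * 100m / 10000;

        Console.WriteLine(intAvg); // 0
        Console.WriteLine(decAvg); // 0.01
    }
}
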
R. Bemrose
Side note: the literal suffix for decimal is `m`, presumably for money.
R. Bemrose
+1  A: 

The expression p.Value * 100 / 10000 uses only integer types, so it is evaluated according to integer division rules.

Change one (or more) of the operands to a decimal and it will behave as expected:

p.Value * 100 / 10000m
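
As a quick sketch (again with a plain int standing in for p.Value): in this variant only the division is decimal, and that is enough, because the int result of the multiplication is implicitly converted to decimal before dividing:

int value = 1; // stand-in for p.Value (assumed)

// value * 100 is still int math (result 100), but dividing by the decimal literal 10000m
// converts that 100 to decimal, so the division keeps the fractional part.
decimal avg = value * 100 / 10000m;

Console.WriteLine(avg); // prints 0.01
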
Oded
+1  A: 

Try changing this:

decimal avg = p.Value * 100 / 10000;

to

decimal avg = Convert.ToDecimal(p.Value) * 100m / 10000m;

Your original version used only integer arithmetic.

LittleBobbyTables
A: 

If p.Value is an integer, you will lose the fraction on this line:

decimal avg = p.Value * 100 / 10000;

So you can do this:

decimal avg = (decimal)p.Value * 100 / 10000; // the cast binds to p.Value, so both the * and / are decimal operations

Hope it helps.

fmikowski
Thank you all for the answers, it did work :)
webyacusa