I am seeing an intriguing situation when rounding currency values in C# (VS 2008 SP1). I was expecting cases five, six, and seven (my bad for not numbering them in the output) to round the value up to a penny, but they do not.
Here is my test code:
    using System;

    class Program
    {
        static void Main(string[] args)
        {
            decimal one = 10.994m;
            decimal two = 10.995m;
            decimal three = 1.009m;
            decimal four = 0.0044m;
            decimal five = 0.0045m;
            decimal six = 0.0046m;
            decimal seven = 0.0049m;
            decimal eight = 0.0050m;

            Console.WriteLine(one + ": " + one.ToString("C"));
            Console.WriteLine(two + ": " + two.ToString("C"));
            Console.WriteLine(three + ": " + three.ToString("C"));
            Console.WriteLine(four + ": " + four.ToString("C"));
            Console.WriteLine(five + ": " + five.ToString("C"));
            Console.WriteLine(six + ": " + six.ToString("C"));
            Console.WriteLine(seven + ": " + seven.ToString("C"));
            Console.WriteLine(eight + ": " + eight.ToString("C"));
            Console.ReadLine();
        }
    }
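What I was picturing was a cascade where the ten-thousandths place bumps the thousandths, which in turn bumps the hundredths (0.0045 -> 0.005 -> 0.01). As a sanity check I also rounded the same values straight to two decimal places myself. This is just my own side test, not anything pulled from the framework, and I am assuming away-from-zero midpoint rounding is the right comparison:

    using System;

    class RoundingComparison
    {
        static void Main()
        {
            decimal[] values = { 10.994m, 10.995m, 1.009m, 0.0044m,
                                 0.0045m, 0.0046m, 0.0049m, 0.0050m };

            foreach (decimal value in values)
            {
                // Round once, directly to two decimal places (half away from zero).
                decimal rounded = Math.Round(value, 2, MidpointRounding.AwayFromZero);
                Console.WriteLine(value + " -> " + rounded.ToString("C"));
            }
        }
    }

Rounded that way, 0.0045, 0.0046, and 0.0049 all come out as $0.00, which matches what ToString("C") shows me, so whatever is happening does not appear to cascade from the fourth decimal place.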
When I reflected into .ToString(string format) to see what was going on, I found:
    public string ToString(string format)
    {
        return Number.FormatDecimal(this, format, NumberFormatInfo.CurrentInfo);
    }
which in turn calls
    [MethodImpl(MethodImplOptions.InternalCall)]
    public static extern string FormatDecimal(decimal value, string format, NumberFormatInfo info);
Is there some logic in that call that says the granularity of my current culture's NumberFormatInfo is two decimal places for currency, so the ten-thousandths place is not allowed to roll the number up because it is considered insignificant?
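To poke at that theory I dumped the culture's currency settings and forced extra digits with the "C4" precision specifier. This is a quick check of my own, assuming I am reading NumberFormatInfo correctly:

    using System;
    using System.Globalization;

    class CultureCheck
    {
        static void Main()
        {
            NumberFormatInfo info = NumberFormatInfo.CurrentInfo;

            // Presumably prints 2 for my culture, which would explain the two-place cutoff.
            Console.WriteLine("CurrencyDecimalDigits: " + info.CurrencyDecimalDigits);

            // "C4" overrides the default currency precision, so the ten-thousandths survive.
            Console.WriteLine(0.0045m.ToString("C4"));
        }
    }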
How is this method implemented? Are we into bit shift land, or is something else going on?
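In case the internal representation is relevant, here is how I peeked at the raw bits of one of the values (again, just my own diagnostic). My understanding is that decimal stores a 96-bit integer plus a power-of-ten scale rather than a binary fraction, so I do not think this is a binary floating-point artifact, but maybe FormatDecimal does something clever with that scale:

    using System;

    class DecimalBits
    {
        static void Main()
        {
            // decimal.GetBits returns four ints: the 96-bit mantissa in [0..2],
            // with the sign and scale packed into [3] (scale is in bits 16-23).
            int[] bits = decimal.GetBits(0.0045m);
            int scale = (bits[3] >> 16) & 0xFF;

            Console.WriteLine("mantissa low word: " + bits[0]); // 45
            Console.WriteLine("scale: " + scale);               // 4, i.e. 45 * 10^-4
        }
    }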
Thanks for any insights.