Yesterday I asked this general question about decimals and their internal precision. Here is a more specific question about the scenario I'm trying to address.
I have a column in SQL Server typed as decimal(18,6). When I fetch values from it, the .NET decimals that come back match the precision in the database. They look like this:
1.100000
0.960000
0.939000
0.844400
0.912340
I need to present (ToString) these values according to these rules:
- Non-zero digits are always shown.
- Trailing zeroes on or before the nth decimal place are shown.
- Trailing zeroes after the nth decimal place are not shown.
So, this is what I want when n is 3:
1.100
0.960
0.939
0.8444
0.91234
Now, I have written some code that calls ToString on the decimal, trims off all trailing zeroes, and then analyzes the resulting string: it looks for the decimal point, counts the decimal places that remain, and works out how many trailing zeroes need to be added back on. Is there a better way to accomplish this?
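Here is roughly what the current code looks like (a simplified sketch rather than the production code; the class and method names are just for illustration, and n is the cutoff from the rules above):

```csharp
using System;
using System.Globalization;

static class DecimalFormatting
{
    // Trim all trailing zeroes off the string form, then pad back up to
    // n decimal places if the trimming removed too many.
    public static string FormatWithMinimumScale(decimal value, int n)
    {
        string text = value.ToString(CultureInfo.InvariantCulture);

        // Only trim when there is a fractional part to trim.
        if (text.IndexOf('.') >= 0)
            text = text.TrimEnd('0').TrimEnd('.');

        // Count how many decimal places survived the trim.
        int pointIndex = text.IndexOf('.');
        int decimals = pointIndex < 0 ? 0 : text.Length - pointIndex - 1;

        // Add back enough zeroes to reach the nth decimal place.
        if (decimals < n)
        {
            if (pointIndex < 0)
                text += ".";
            text += new string('0', n - decimals);
        }

        return text;
    }
}

// FormatWithMinimumScale(1.100000m, 3)  -> "1.100"
// FormatWithMinimumScale(0.844400m, 3)  -> "0.8444"
// FormatWithMinimumScale(0.912340m, 3)  -> "0.91234"
```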
Also, I know I said ToString in the question above... but if I could modify the decimals on their way out of my data-access layer, such that consumers always get decimals with the proper precision, that would be better. Is it possible to perform this operation on the decimal itself without string manipulation?
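To make that concrete, this is the kind of thing I have been sketching. I'm not at all sure it is the right approach — TrimToMinimumScale is just a name I made up, and it relies on my (possibly mistaken) reading that decimal.Round reduces the scale to the requested number of places and that == compares decimals numerically, ignoring scale:

```csharp
using System;

static class DecimalScale
{
    // Drop trailing zeroes from a decimal's scale, but never go below
    // minDecimals, and never drop a significant (non-zero) digit.
    public static decimal TrimToMinimumScale(decimal value, int minDecimals)
    {
        // The current scale lives in bits 16-23 of the flags element.
        int scale = (decimal.GetBits(value)[3] >> 16) & 0xFF;

        while (scale > minDecimals)
        {
            // Rounding to one fewer place preserves the numeric value
            // only if the digit being dropped is a zero.
            decimal shorter = decimal.Round(value, scale - 1);
            if (shorter != value)
                break;              // a non-zero digit would be lost; stop

            value = shorter;        // same number, one fewer decimal place
            scale--;
        }

        return value;
    }
}

// TrimToMinimumScale(1.100000m, 3).ToString()  -> "1.100"
// TrimToMinimumScale(0.844400m, 3).ToString()  -> "0.8444"
// TrimToMinimumScale(0.912340m, 3).ToString()  -> "0.91234"
```

Since values from the decimal(18,6) column always arrive with scale 6, I haven't worried about padding a too-short decimal back up.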