I want a rounding method for double values in C#. It needs to round a double to any given rounding interval. The code I have looks like this:
public static double RoundI(double number, double roundingInterval)
{
    // A zero interval would cause division by zero; return the input unchanged.
    if (roundingInterval == 0.0)
    {
        return number;
    }
    double intv = Math.Abs(roundingInterval);
    double sign = Math.Sign(number);
    double val = Math.Abs(number);
    double valIntvRatio = val / intv;
    double k = Math.Floor(valIntvRatio);   // whole intervals below val
    double m = valIntvRatio - k;           // fractional part of the ratio
    bool mGreaterThanMidPoint = (m - 0.5) >= 1e-14;
    bool mInMidpoint = Math.Abs(m - 0.5) < 1e-14;
    return (mGreaterThanMidPoint || mInMidpoint) ? sign * ((k + 1) * intv) : sign * (k * intv);
}
So RoundI(100, 3) should give 99 and RoundI(1.2345, 0.001) should give 1.235.
The problem is that RoundI(1.275, 0.01) returns 1.27 rather than 1.28. This is because when double valIntvRatio = val / intv is executed, that is, valIntvRatio = 1.275 / 0.01, the result is 127.49999999999999 rather than 127.5. I know this is a consequence of binary floating-point representation in any programming language. My question is: is there a standard way to do this kind of rounding without having to worry about double precision? Here I set the tolerance to 1e-14, but that is too strict for this problem, and I don't know what the correct tolerance should be. Thanks for any help.
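For reference, one idea I have considered is doing the division in decimal instead of double, since decimal is base-10 and represents values like 1.275 and 0.01 exactly once converted. This is only a sketch under the assumption that both arguments fit comfortably in decimal's range; the name RoundIDecimal is my own:

```csharp
using System;

public static class Rounding
{
    // Sketch: perform the interval arithmetic in decimal so that
    // 1.275 / 0.01 evaluates to exactly 127.5 rather than 127.4999...
    // (the double-to-decimal conversion rounds to 15 significant digits,
    // recovering the intended decimal value). Assumes the inputs are
    // within decimal's range.
    public static double RoundIDecimal(double number, double roundingInterval)
    {
        if (roundingInterval == 0.0)
        {
            return number;
        }
        decimal intv = Math.Abs((decimal)roundingInterval);
        decimal val = (decimal)number;
        // AwayFromZero reproduces the "midpoint rounds up in magnitude"
        // behavior of the original RoundI.
        decimal rounded = Math.Round(val / intv, MidpointRounding.AwayFromZero) * intv;
        return (double)rounded;
    }
}
```

With this, RoundIDecimal(1.275, 0.01) gives 1.28, RoundIDecimal(100, 3) gives 99, and RoundIDecimal(1.2345, 0.001) gives 1.235 — but I don't know whether this is the standard approach.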