I am testing basic math functions that return the mean, variance, and standard deviation of a data set. The problem I am facing is that I cannot get the precision of the "expected value" to match what the function returns. For example, if the variance function returns 50.5555555555566, then even when I set the expected value explicitly to 50.5555555555566, the assert reports them as two different doubles and the unit test fails.
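For illustration, here is a stripped-down sketch of the kind of test I am writing. The `Statistics.Variance` implementation and the data set below are placeholders rather than my actual code, but the shape of the failing assert is the same:

```csharp
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Placeholder implementation so the sketch compiles; my real class is more involved.
public static class Statistics
{
    public static double Variance(double[] values)
    {
        double mean = values.Average();
        return values.Sum(v => (v - mean) * (v - mean)) / values.Length;
    }
}

[TestClass]
public class StatisticsTests
{
    [TestMethod]
    public void Variance_ReturnsExpectedValue()
    {
        double[] values = { 49, 52, 61, 43, 48, 55, 50, 47 }; // placeholder data

        double actual = Statistics.Variance(values);

        // The expected literal is copied straight from the value the function returned,
        // yet Assert.AreEqual still reports the two doubles as different.
        Assert.AreEqual(50.5555555555566, actual);
    }
}
```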
Below is the actual output from the unit test:
Assert.AreEqual failed. Expected:<50.5555555555556>. Actual:<50.5555555555566>.
Can anyone advise on a way around this? I am using the built-in Visual Studio unit testing suite. Thanks.