views: 578
answers: 2
Hi, I have the following code:

    float totalSpent;
    int intBudget;
    float moneyLeft;

    totalSpent += Amount;
    moneyLeft = intBudget - totalSpent;

And this is how it looks in debugger: http://www.braginski.com/math.tiff

Why is moneyLeft, as calculated by the code above, .02 different from the same expression evaluated by the debugger?

The expression window is correct, yet the code above produces a result that is off by .02. It only happens for very large numbers (though still well below the int limit).

thanks

+3  A: 

Floating point will always produce strange results in money calculations.

The golden rule is that floating point is good for things you measure (litres, yards, light-years, bushels, etc.) but not for things you count (sheep, beans, buttons, etc.).

Most money calculations come down to counting pennies, so use integer math and you won't get these strange results. Either use a fixed-point decimal arithmetic library (which would probably be overkill on an iPhone) or store your amounts as whole numbers of cents and only convert to dollars and cents for display.

James Anderson
Thank you, James. See comment above. Since I have never programmed a financial app, it never occurred to me that float is the wrong type.
leon
+3  A: 
Crashworks
Excellent explanations, thanks! I just rewrote the entire app (thankfully all the money calculations were in one class) to use NSDecimalNumber, and it started working perfectly.
leon