Before you start laughing at such a simple question, let me explain:
I am trying to determine how much change (as a percentage) there is in an account across various indicators. That part is not particularly hard. But how do you generally handle the cases where the current value or the previous value is zero?
For example:
This week: Earnings = $25.6
Last week: Earnings = $0.0
I currently calculate the % difference with the following (the original had `If` capitalized, which doesn't compile in C#):

if (CurrentValue > 0.0 && PreviousValue > 0.0)
{
    return (CurrentValue - PreviousValue) / PreviousValue;
}
return 0.0;
If the earnings were zero in the previous week, what should the % difference be? +Infinity? And conversely, if the current week is zero: -Infinity?
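One idea I have been toying with (just a sketch, and the names are mine): return a nullable double, so "undefined" (no baseline to divide by) is distinct from "0% change". Note that a current value of zero with a positive previous value then falls out naturally as -1.0, i.e. -100%, rather than -Infinity:

```csharp
using System;
using System.Diagnostics;

static class PercentChangeDemo
{
    // Hypothetical helper: null means "undefined" (no previous baseline),
    // instead of overloading 0.0 or returning +/-Infinity.
    public static double? PercentChange(double current, double previous)
    {
        if (previous == 0.0)
        {
            // Zero to zero reads as "no change"; zero to something is undefined.
            return current == 0.0 ? 0.0 : (double?)null;
        }
        return (current - previous) / previous;
    }
}
```

The caller can then decide how to display the null case (e.g. "N/A" or "new").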
Then, to complicate things, how would you handle this in a LINQ to SQL query? At the moment the guard forces me to repeat each Average three times:
Upside_Earnings =
    (statistics.Where(d => d.DateTime > first_startdate && d.DateTime <= first_enddate)
               .Average(e => (double)e.Earnings) > zero &&
     statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
               .Average(e => (double)e.Earnings) > zero)
    ? ((statistics.Where(d => d.DateTime > first_startdate && d.DateTime <= first_enddate)
                  .Average(e => (double)e.Earnings) -
        statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
                  .Average(e => (double)e.Earnings)) /
       statistics.Where(d => d.DateTime > second_startdate && d.DateTime <= second_enddate)
                 .Average(e => (double)e.Earnings))
    : zero,
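For what it's worth, when the projection allows it (e.g. a plain assignment after materializing, rather than inside the query's object initializer), I could hoist each window's average into a local and compute it once. A self-contained sketch, with a made-up `Stat` type standing in for my statistics rows (also note that `Average` throws on an empty sequence, which this sketch does not guard against):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Stat
{
    public DateTime DateTime;
    public decimal Earnings;
}

static class UpsideDemo
{
    // Sketch: compute each window's average once, then guard the division,
    // instead of repeating the Where(...).Average(...) three times inline.
    public static double UpsideEarnings(IEnumerable<Stat> statistics,
        DateTime firstStart, DateTime firstEnd,
        DateTime secondStart, DateTime secondEnd)
    {
        double firstAvg = statistics
            .Where(d => d.DateTime > firstStart && d.DateTime <= firstEnd)
            .Average(e => (double)e.Earnings);
        double secondAvg = statistics
            .Where(d => d.DateTime > secondStart && d.DateTime <= secondEnd)
            .Average(e => (double)e.Earnings);

        return (firstAvg > 0.0 && secondAvg > 0.0)
            ? (firstAvg - secondAvg) / secondAvg
            : 0.0;
    }
}
```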