This seems like a bug to me...
I accept that automatic properties defined like this:
public decimal? Total { get; set; }
will be null when they are first accessed. They haven't been initialized, so of course they're null.
But even after assigning to it with +=, the decimal? remains null. So after:
Total += 8;
Total is still null. How can this be correct? I understand that it's evaluating (null + 8), but it seems strange that it doesn't recognize that Total should just be set to 8...
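To make the behavior concrete, here's a minimal sketch (the ?? 0 coalescing at the end is just the workaround I assume one is meant to write, not anything the docs pointed me to):

    using System;

    decimal? total = null;

    total += 8;                        // expands to: total = total + 8;
                                       // null + 8 evaluates to null
    Console.WriteLine(total.HasValue); // False

    // Coalescing to zero first gives the behavior I expected:
    total = (total ?? 0) + 8;
    Console.WriteLine(total);          // 8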
Addenda:
I made the "null + 8" point in my question, but notice that it works with strings: null + "hello" returns "hello" just fine. Behind the scenes, then, it's effectively initializing the string to a string object with the value "hello". IMO, the behavior should be the same for the other types. It might be because a string can accept null as a value, but a null string still isn't an initialized object, correct?
Perhaps it's just because string isn't a Nullable<T> type...
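Here's a sketch of the comparison I mean (my understanding, which may be wrong, is that string's + lowers to String.Concat, which substitutes the empty string for null operands, whereas arithmetic on Nullable<T> is "lifted" so that null + 8 yields null):

    using System;

    string s = null;
    s += "hello";                  // String.Concat(null, "hello"):
                                   // null is treated as ""
    Console.WriteLine(s);          // hello

    decimal? d = null;
    d += 8;                        // lifted +: null + 8 -> null
    Console.WriteLine(d.HasValue); // False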