What you are seeing is the difference between using Enumerable.Sum and actually adding the values yourself. The important thing here is that null is not zero. At first glance you would think that singleSum should equal 17, but that would mean assigning different semantics to null depending on the data type of the reference. The fact that this is an int? makes no difference: null is null and should never be semantically equal to the numeric constant 0.
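As a quick sketch of that point (the variable names below are my own, not from the question), nullable arithmetic never quietly substitutes 0 for null:

using System;

int? missing = null;
int? result = missing + 17;

// False - the result is null, not 17
Console.WriteLine(result.HasValue);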
The implementation of Enumerable.Sum is designed to skip over any value in the sequence that is null, which is why you are seeing the different behavior between the two tests. However, the second test rightly returns null: the compiler lifts the + operator for nullable types, and a lifted operator yields null whenever either operand is null.
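To make the contrast concrete, here is a small hypothetical example (the values { 9, 8, null } are mine, chosen so the non-null entries total 17):

using System;
using System.Linq;

int?[] values = { 9, 8, null };

// Enumerable.Sum skips the null entry, so this is 17.
int? viaSum = values.Sum();

// The lifted + operator propagates null, so this is null.
int? viaAddition = values[0] + values[1] + values[2];

Console.WriteLine(viaSum);              // 17
Console.WriteLine(viaAddition == null); // True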
Here is the implementation of the Enumerable.Sum overload that accepts an IEnumerable<int?>:
public static int? Sum(this IEnumerable<int?> source)
{
    if (source == null)
    {
        throw Error.ArgumentNull("source");
    }
    int num = 0;
    foreach (int? nullable in source)
    {
        // As you can see here it is explicitly designed to
        // skip over any null values
        if (nullable.HasValue)
        {
            num += nullable.GetValueOrDefault();
        }
    }
    return new int?(num);
}
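One consequence that follows directly from this code: because num starts at 0 and null values are simply skipped, this overload returns 0 rather than null for an empty sequence or a sequence containing nothing but nulls:

// Assumes: using System; using System.Linq;
int?[] onlyNulls = { null, null };
Console.WriteLine(onlyNulls.Sum()); // prints 0, not null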