views: 257
answers: 5
With a 1D array, I can use the sum method to get the sum of all the values.

int[] array = {6,3,1};
Console.WriteLine(array.Sum());

With a multidimensional array (3D in my case), this can't be done. Obviously I could go all foreach on it, but this seems verbose and I suspect it will perform badly.

Is there a way to flatten the array? Or a nice way just to get the sum that I've not seen?
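
For reference, a minimal sketch of the foreach baseline the question describes (sample data and names are assumed); with a rectangular int[,,], foreach already visits every element:

int[,,] cube = { { { 6, 3 }, { 1, 2 } }, { { 4, 5 }, { 7, 8 } } }; // assumed sample data
int total = 0;
foreach (int value in cube) // enumerates every element of the 3D array
    total += value;
Console.WriteLine(total); // 36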

+7  A: 

Sum does exactly a foreach internally; there is no magic behind it. If you are that performance hungry, use for instead of foreach. You can also do this in parallel; this operation is easily parallelized.

Andrey
+1, Good point, parallelize if you can @Tom Wright
bassfriend
Perhaps this should be a separate question, but how would I parallelize it? (Very new to C#...)
Tom Wright
Check out these two links: http://msdn.microsoft.com/en-us/magazine/cc163329.aspx and http://blogs.msdn.com/b/pfxteam/ (from the blog in particular you can download a bunch of samples, ParallelExtensionsExtras).
bassfriend
@Tom Wright it is one of the most famous parallel tasks; just google "parallel sum computation C#". The basic idea is to divide the array into parts and calculate sub-sums in parallel threads. PLINQ makes it even easier. But before you start optimizing, make sure that it is the actual bottleneck.
Andrey
@Tom Wright - Depending on which version of C# you're using, you could use Parallel.ForEach: http://msdn.microsoft.com/en-us/library/system.threading.tasks.parallel.foreach.aspx
Justin Niessner
Wow, thanks guys!
Tom Wright
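
As a rough illustration of the PLINQ route suggested in the comments (sample data and names are assumed; a rectangular int[,,] only implements the non-generic IEnumerable, hence the Cast<int>() before AsParallel):

int[,,] cube = { { { 6, 3 }, { 1, 2 } }, { { 4, 5 }, { 7, 8 } } };
int total = cube.Cast<int>()   // flatten the 3D array into a sequence of ints (System.Linq)
                .AsParallel()  // let PLINQ partition the summation across threads
                .Sum();
Console.WriteLine(total); // 36

For an array this small the threading overhead outweighs any gain; the parallel version only pays off for large inputs.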
+2  A: 

Why should a foreach perform badly? You have to read every value at least once to calculate the sum; there is no way around this (assuming "random" values, of course). So maybe there is a more beautiful way, but not a more performant one (speaking in terms of Big O).

tanascius
`for` is a bit cheaper.
Andrey
I'm not sure, but according to some microbenchmarks I've seen, there can be performance differences between them.
bassfriend
@Andrey, bassfriend: you are right, maybe you get more performant implementations, but all of them rely on looping over every single value. Microbenchmarking as well as parallelizing will probably help, but my point is that an algorithm using *foreach/for* does not have bad performance per se. LINQ or whatever cannot skip the step of looping over all values.
tanascius
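
A minimal sketch of the plain nested-for version under discussion (names assumed), which still touches every element exactly once:

int[,,] cube = { { { 6, 3 }, { 1, 2 } }, { { 4, 5 }, { 7, 8 } } };
int total = 0;
for (int i = 0; i < cube.GetLength(0); i++)
    for (int j = 0; j < cube.GetLength(1); j++)
        for (int k = 0; k < cube.GetLength(2); k++)
            total += cube[i, j, k]; // one addition per element: linear in the element count
Console.WriteLine(total); // 36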
+1  A: 

If you have a jagged array and would like clean code, you could use

int[][] array = { new []{ 6, 3, 1 }, new []{ 6, 3, 1 } };
Console.WriteLine(array.Sum(i => i.Sum()));
Yuriy Faktorovich
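
If the data were a three-level jagged array, the same idea nests once more (sample data assumed; requires System.Linq):

int[][][] array =
{
    new[] { new[] { 6, 3 }, new[] { 1 } },
    new[] { new[] { 2, 4 } }
};
Console.WriteLine(array.Sum(plane => plane.Sum(row => row.Sum()))); // 16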
+2  A: 

This'll do the trick:

var i = array.SelectMany(j => j).Sum();

You could parallelize this in .NET 4 like this:

var i = array.AsParallel().SelectMany(k => k).Sum();
the_ajp
Haven't benchmarked it, but I think array.Sum(j => j.Sum()) should be faster and in no way less readable. SelectMany().Sum() has to move through the array twice (AFAIK): once to generate the new enumerable and again to calculate the sum (unless LINQ optimizes this; not sure about that).
dbemerlin
Sum takes an IEnumerable, so it will probably keep a running total and loop through it. So for every int SelectMany delivers, it will add it to the total, meaning it only goes through everything once. But I haven't looked into it either.
the_ajp
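
For what it's worth, a quick usage sketch of both forms on a jagged array (sample data assumed; both need System.Linq and give the same total):

int[][] array = { new[] { 6, 3, 1 }, new[] { 2, 4 } };
Console.WriteLine(array.SelectMany(j => j).Sum());              // 16
Console.WriteLine(array.Sum(j => j.Sum()));                     // 16
Console.WriteLine(array.AsParallel().SelectMany(k => k).Sum()); // 16 (PLINQ, .NET 4+)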
A: 

How many entries will the lists contain?

A sum function over a 3-dimensional array using nested for-loops will run in O(n^3). Assuming you don't have millions of entries, it shouldn't really be a performance issue.

Doing it in parallel would give you a theoretical running time of O(3*n) = O(n), but really it isn't that big of an improvement in actual seconds unless, as mentioned above, you have a huge load of entries.

Jes
The run time on a 3D array is O(n) where n is the number of elements. For example, you could have a 1x1x10 array; that is still just 10 additions. If you have 4 degrees of parallelism, the complexity is O(1/4*n + 4) = O(n).
John Gietzen
I assumed the arrays were 3 x n x n.
Jes