tags:

views:

82

answers:

1

I read that PLINQ will automatically fall back to non-parallel LINQ if it finds the parallel version to be more expensive. So I figured: why not use PLINQ for everything (when possible) and let the runtime decide which one to use?

The apps will be deployed to multicore servers, and I am OK with writing a little more code to deal with parallelism.

What are the pitfalls of using PLINQ by default?

+2  A: 

One pitfall is that you lose the ability to rely on the ordering of results.

Take the following code:

var results = new int[] { 0, 1, 2, 3 };
var doSomethingSpecial = (from r in results.AsParallel() select r / 2).ToArray();

You can't count on the results coming back in order, so the output could be any permutation of the set. This is one of the largest pitfalls: if you are dealing with ordered data, you can lose the performance benefit to the cost of re-sorting.
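If ordering matters, PLINQ can be told to preserve the source order with AsOrdered(), at the cost of some buffering overhead. A minimal sketch:

```csharp
using System;
using System.Linq;

class OrderingDemo
{
    static void Main()
    {
        var results = new int[] { 0, 1, 2, 3 };

        // AsOrdered() tells PLINQ to preserve the source ordering in the
        // output, at the cost of extra coordination between workers.
        var halved = (from r in results.AsParallel().AsOrdered()
                      select r / 2).ToArray();

        Console.WriteLine(string.Join(",", halved)); // 0,0,1,1
    }
}
```

Without AsOrdered(), the same query may return the elements in any order.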

Another issue is that you lose the ability to catch specific exceptions directly. You couldn't catch a NullReferenceException (not that you should ever do that) or even catch a FormatException.
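Exceptions thrown inside a parallel query surface wrapped in an AggregateException, so a direct catch of the original exception type never fires. A small sketch of what that looks like:

```csharp
using System;
using System.Linq;

class ExceptionDemo
{
    static void Main()
    {
        var inputs = new[] { "1", "2", "oops", "4" };

        try
        {
            // "oops" makes int.Parse throw a FormatException on a worker.
            var parsed = inputs.AsParallel().Select(int.Parse).ToArray();
        }
        catch (AggregateException ae)
        {
            // A plain `catch (FormatException)` would not run here; the
            // original exceptions are found via InnerExceptions instead.
            foreach (var ex in ae.InnerExceptions)
                Console.WriteLine(ex.GetType().Name); // FormatException
        }
    }
}
```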

There are plenty of reasons why you should not always use PLINQ, and I will highlight just one more. Don't read too much into the "automatic use of non-parallel LINQ": it only handles the boundary cases where a query is too simple to benefit or too complex to run in parallel.

Always keep in mind that the more you use PLINQ, the more resources you consume on the server, taking them away from other running threads.
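If that is a concern, PLINQ lets you cap how many cores a query may use with WithDegreeOfParallelism. A sketch (the value 2 is just an illustrative choice):

```csharp
using System;
using System.Linq;

class ThrottleDemo
{
    static void Main()
    {
        var data = Enumerable.Range(0, 1000);

        // Cap the query at 2 concurrent workers so the rest of the
        // server's threads are not starved of CPU.
        var sum = data.AsParallel()
                      .WithDegreeOfParallelism(2)
                      .Select(x => x * 2)
                      .Sum();

        Console.WriteLine(sum); // 999000
    }
}
```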

Resources:

MSDN PLINQ white paper

Paul Kimmel on PLINQ

Nix
For guaranteed ordering you can just use AsOrdered(), though you obviously need to measure to find out whether you gain anything in your algorithm, since this does add overhead. As for exceptions, you will get an AggregateException if anything occurs in the parallelized execution, from which you can obtain the individual exceptions that occurred via the InnerExceptions property. As with any technology, the only way to know whether it will benefit you is to measure using data sets representative of what you'll be processing in the real world.
Drew Marsh