When constructing LINQ expressions (in my case, LINQ to Objects) there are many ways to accomplish the same result, some of them far more efficient than others.
- Is there a good way to "tune" or optimize these expressions?
- What fundamental metrics do folks employ and how do you gather them?
- Is there a way to get at "total iterations" count or some other metric, where you could "know" that lower means better?
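For context, the coarsest metric I know how to gather today is wall-clock time over a representative dataset. A minimal timing harness using `System.Diagnostics.Stopwatch` might look like this (the `Time` helper name and the sample query are my own, just for illustration):

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class Program
{
    // Times a single run of a query delegate. Wall-clock time over a
    // representative dataset is the simplest metric to gather, but it is
    // hardware-dependent -- which is exactly the problem described below.
    static TimeSpan Time(Action query)
    {
        var sw = Stopwatch.StartNew();
        query();
        sw.Stop();
        return sw.Elapsed;
    }

    static void Main()
    {
        var data = Enumerable.Range(0, 100_000).ToList();

        var elapsed = Time(() =>
        {
            _ = data.Count(x => x % 2 == 0);
        });

        Console.WriteLine(elapsed);
    }
}
```

This tells me how long a query takes on one machine, but not how much work it does, which is what the rest of this question is about.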
EDIT
Thanks Richard/Jon for your answers.
What it seems I really want is a simple operation count ("OCount") for a LINQ expression, though I am not sure the hooks exist in LINQ to allow it.

Suppose that I have a target level of performance for specific machine hardware (an SLA). Ideally, I would add a unit test to confirm that the typical data moved through that query is processed within the time the SLA allots. The problem is that this test would run on the build server, developers' machines, etc., which probably bear little resemblance to the hardware the SLA targets.

So the idea is that I would determine an acceptable maximum OCount for the expression, knowing that if the OCount is less than X, the query will certainly provide acceptable performance under the SLA on the "typical" target hardware. If the OCount exceeds this threshold, the build/unit test would generate a warning. Ideally, I would like to have something like this (pseudocode-ish):
    var results = [big linq expression run against test dataset];
    Assert.IsLess(MAXALLOWABLE_OCOUNT, results.OCount);
where results.OCount would simply give me the total iterations (n) necessary to produce the result set.
Why would I want this?
Well, with even a moderately sized LINQ expression, a small change or addition can have a huge effect on performance by increasing the overall operation count. The application code would still pass all of its unit tests, since it still produces the correct results, but perform miserably once deployed.
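To make that concrete with a contrived example of my own (not from any of the linked answers): filtering one list by membership in another is an innocent-looking one-liner whose cost depends enormously on the lookup structure, because `List<T>.Contains` scans the whole list for every element.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        var orders = Enumerable.Range(0, 5000).ToList();
        var validIds = Enumerable.Range(2500, 5000).ToList();

        // O(n * m): List<int>.Contains does a linear scan of validIds
        // for every element of orders -- roughly 25 million comparisons here.
        var slow = orders.Where(o => validIds.Contains(o)).Count();

        // O(n + m): one small change -- build a HashSet first -- and each
        // lookup becomes (amortized) constant time.
        var idSet = new HashSet<int>(validIds);
        var fast = orders.Where(o => idSet.Contains(o)).Count();

        Console.WriteLine(slow == fast); // same result, very different cost
    }
}
```

Both queries pass any correctness-based unit test, which is exactly why a work-based metric like an OCount would be useful.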
The other reason is simple learning: if you change something and the OCount goes up or down by an order of magnitude, you learn something.
EDIT #2
I'll throw in a potential answer as well. It is not mine; it comes from Cameron MacFarland's answer to another question of mine that spawned this one. I think that answer could work here, in a unit test environment like the one described in the first edit to this question.
The essence of it is to create the test datasets in the unit test fixture, feed them into the LINQ expression in the way outlined in that answer, then add up the iteration counts and compare the total to the maximum allowable count.
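The idea above can be sketched as a wrapper around each test dataset that increments a shared counter every time an element is pulled through it. The names here (`IterationCounter`, `Wrap`, `MaxAllowableOCount`) are mine, not from the linked answer, and I've used a plain exception in place of a specific test framework's assertion:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Counts how many elements are yielded from any enumerable it wraps.
// Because LINQ to Objects is lazy, the counter only advances when the
// query actually enumerates the wrapped source.
class IterationCounter
{
    public int Count { get; private set; }

    public IEnumerable<T> Wrap<T>(IEnumerable<T> source)
    {
        foreach (var item in source)
        {
            Count++;          // one "operation" per element yielded
            yield return item;
        }
    }
}

class Program
{
    static void Main()
    {
        var counter = new IterationCounter();
        var data = counter.Wrap(Enumerable.Range(0, 100));

        // The LINQ expression under test, run against the wrapped dataset.
        var results = data.Where(x => x % 2 == 0).ToList();

        const int MaxAllowableOCount = 150;
        if (counter.Count > MaxAllowableOCount)
            throw new Exception(
                $"OCount {counter.Count} exceeds {MaxAllowableOCount}");

        Console.WriteLine(counter.Count); // 100: Where enumerated every element
    }
}
```

Unlike wall-clock time, this count is identical on the build server and on the SLA's target hardware, which is what makes it usable as a unit test threshold.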