views:

31

answers:

1

I am working on a tool to optimize LINQ to SQL queries. Basically, it intercepts the LINQ execution pipeline and applies some optimizations, for example removing a redundant join from a query. Of course, there is some overhead in execution time before the query reaches the DBMS, but the query itself should then be processed faster. I don't want to use a SQL profiler because I already know that the generated query will perform better in the DBMS than the original one; what I am looking for is a correct way of measuring the total time between the creation of the query in LINQ and the end of its execution. Currently, I am using the Stopwatch class and my code looks something like this:

var sw = new Stopwatch();
sw.Start();
const int amount = 100;
for (var i = 0; i < amount; i++)
{
   ExecuteNonOptimizedQuery();
}
sw.Stop();
Console.WriteLine("Executing the query {0} times took: {1}ms. On average, each query took: {2}ms",
    amount, sw.ElapsedMilliseconds, sw.ElapsedMilliseconds / amount);

Basically, the ExecuteNonOptimizedQuery() method creates a new DataContext, creates a query and then iterates over the results.
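For context, here is a minimal sketch of what such a method might look like. The entity names (NorthwindDataContext, Orders, OrderID, ShipCity) are placeholders for illustration only, not taken from the actual tool:

// Hypothetical sketch; the data context and entity names are assumptions.
static void ExecuteNonOptimizedQuery()
{
    // A fresh DataContext per iteration prevents cached objects from
    // one run skewing the timing of the next.
    using (var db = new NorthwindDataContext())
    {
        var query = from o in db.Orders
                    where o.ShipCity == "London"
                    select o;

        // A LINQ to SQL query is not sent to the DBMS until it is
        // enumerated, so iterating the results forces execution.
        foreach (var order in query)
        {
            var id = order.OrderID; // touch a field to materialize the row
        }
    }
}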

I did this for both versions of the query, the normal one and the optimized one. I took the idea from this post from Frans Bouma.

Is there any other approach/considerations I should take?

Thanks in advance!

+1  A: 

You could run it under a profiler, but an instrumented build will impact the performance of the code, possibly distorting the result (with a large part of the total execution time spent in SQL Server, this is very likely). A sampling profiler might help. (The Visual Studio Team System profiler can do both.)

For an isolated test comparing two approaches, using Stopwatch is generally the preferred method. You can subtract the time shown in SQL Profiler from the Stopwatch time to get the time spent in client code (including your post-processing).
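The subtraction is simple, but worth spelling out; the numbers below are purely illustrative, not measurements:

// Hypothetical numbers: totalMs from Stopwatch.ElapsedMilliseconds,
// dbmsMs from summing the Duration column in SQL Profiler for the
// same batch of queries.
long totalMs = 180;                // wall-clock time for all runs
long dbmsMs  = 120;                // time spent inside SQL Server
long clientMs = totalMs - dbmsMs;  // LINQ translation + materialization
Console.WriteLine("Client-side cost: {0}ms", clientMs);

This isolates the overhead your optimizer adds on the client from the savings it produces in the DBMS.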

Richard
Thanks for your answer!
uvita