views:

97

answers:

3

Hi, I am currently trying to wring more performance out of my reporting website, which uses LINQ to SQL and a SQL Server Express 2008 database. I am finding that, as I now approach a million rows in one of my more 'ugly' tables, performance is becoming a real issue, with one report in particular taking 3 minutes to generate.

Essentially, I have a loop that, for each user, hits the database and grabs a collection of data on them. This data is then queried in various ways (and more rows loaded as needed) until I have a nice little summary object that I can fire off to a set of Silverlight charts. Lazy loading is used, and the reporting pulls in data from around 8 linked tables.

The problem is I don't know where the bottleneck now lies or how to improve performance. Due to certain constraints I was forced to use uniqueidentifiers for a number of primary keys in the tables involved - could this be an issue?

Basically, I need to put time into increasing performance, but I don't have enough to do that for both the database and the LINQ to SQL. Is there any way I can see where the bottlenecks are?

As I'm running Express, I don't have access to the Profiler. I am considering rewriting my queries as compiled LINQ to SQL queries, but fear the database may be the culprit.
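For reference, compiled queries cache the expression-to-SQL translation so it isn't redone on every execution. Since `CompiledQuery.Compile` needs a live `DataContext` to demonstrate, here is the same compile-once-and-reuse idea sketched with a plain expression tree instead (all names hypothetical, no database required):

```csharp
using System;
using System.Linq.Expressions;

// LINQ to SQL normally re-translates the expression tree on every run;
// CompiledQuery.Compile pays that cost once. The same idea, shown with
// Expression<T>.Compile: translate once, then reuse the delegate.
Expression<Func<int, int>> doubleIt = n => n * 2;

// The expensive step, done once up front.
Func<int, int> compiled = doubleIt.Compile();

// Cheap reuse on every subsequent call.
int result = compiled(21);
Console.WriteLine(result); // prints 42
```

With LINQ to SQL itself the shape would be something like `static readonly Func<MyDataContext, Guid, IQueryable<UserRow>> q = CompiledQuery.Compile(...)`, where `MyDataContext` and `UserRow` stand in for your own types.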

I understand this question is a bit open-ended and it's hard to answer without knowing much more about my setup (database schema etc.), but any advice on how to find out where the bottlenecks are is much appreciated!

Thanks

UPDATE: Thanks for all the great advice guys, and some links to some great tools.

UPDATE: For those interested, I have been unable to make my queries any quicker by tweaking the LINQ. The problem seems to be that the majority of my database access code takes place in a loop. I can't see a way around it. Basically, I am building up a report by looking through a number of users' data - hence the loop. Pulling all the records up front seems a bit crazy - 800,000+ rows. My gut feeling is that there is a much better way, but it's a technological leap too far for me!

However, adding another index to one of the foreign keys in one of the tables boosted performance so now the report takes 20 seconds to generate as opposed to 3 minutes!
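For what it's worth, the per-user loop described above is the classic N+1 query pattern; grouping server-side collapses it into a single round trip. A minimal sketch of the shape, using in-memory data and hypothetical column names (the same `GroupBy` translates to a single `GROUP BY` when run against a `DataContext` table):

```csharp
using System;
using System.Linq;

// Hypothetical in-memory stand-in for the big table: (UserId, Value) rows.
var activity = new[]
{
    (UserId: 1, Value: 10),
    (UserId: 1, Value: 20),
    (UserId: 2, Value: 5),
};

// One pass over the data: group by user, then summarise each group.
// Against the database this becomes one query, not one query per user.
var summaries = activity
    .GroupBy(r => r.UserId)
    .Select(g => new { UserId = g.Key, Count = g.Count(), Total = g.Sum(r => r.Value) })
    .ToDictionary(s => s.UserId);

Console.WriteLine($"user 1: {summaries[1].Count} rows, total {summaries[1].Total}");
```

Whether this beats the loop in practice depends on how much per-user post-processing the report does, but it avoids the per-iteration round trips that the Linq2Sql Profiler mentioned below would flag.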

+2  A: 

There are two tools I use for this: LINQPad and the Visual Studio debugger. First, check out LINQPad; even the free version is very powerful, showing you execution time and the SQL generated, and you can use it to run any code snippet...it's tremendously useful.

Second, you can use the Visual Studio debugger. This is something we use on our DataContext (note: only use this in debug builds; it's a performance hit and completely unnecessary outside of debugging):

// Requires: using System.Data; using System.Diagnostics;
#if DEBUG
  // Times how long each connection stays open. 'Current' is assumed to be
  // a static property exposing the active DataContext instance.
  private readonly Stopwatch Watch = new Stopwatch();

  private static void Connection_StateChange(object sender, StateChangeEventArgs e)
  {
    // Start timing when the connection opens...
    if (e.OriginalState == ConnectionState.Closed && e.CurrentState == ConnectionState.Open)
    {
      Current.Watch.Start();
    }
    // ...and report the elapsed time when it closes again.
    else if (e.OriginalState == ConnectionState.Open && e.CurrentState == ConnectionState.Closed)
    {
      Current.Watch.Stop();

      string msg = string.Format("SQL took {0}ms", Current.Watch.ElapsedMilliseconds);
      Trace.WriteLine(msg);
    }
  }
#endif

// Public so callers outside the class can write DataContext.New.
public static DataContext New
{
  get
  {
    var dc = new DataContext(ConnectionString);
#if DEBUG
    if (Debugger.IsAttached)
    {
      dc.Connection.StateChange += Connection_StateChange;
      dc.Log = new DebugWriter();
    }
#endif
    return dc;
  }
}

In a debug build, as an operation completes on each context, we see the elapsed time in the debug window along with the SQL it ran. The DebugWriter class you see can be found here (credit: Kris Vandermotten). We can quickly see if a query's taking a while. To use it, we just instantiate a DataContext via:

var DB = DataContext.New;

(The Profiler is not an option for me since we don't use SQL Server; this answer is simply to give you some alternatives that have been very useful for me.)

Nick Craver
+2  A: 

I used this excellent tool: Linq2Sql Profiler. It works on the application side, so there is no need for database-server profiling functionality.

You have to add one line of initialization code to your application, and then a separate desktop application shows you the SQL for each LINQ query, along with the exact line of code where it was executed (.cs or .aspx) and the database time and application time of each execution; it even detects some common performance problems, such as N+1 queries (a query executed per iteration) or unbounded result sets. You have to pay for it, but a trial version is also available.

PanJanek
Direct link - http://l2sprof.com/ - we use this here on Stack Overflow; truly a wonderful asset when tuning any queries!
Jarrod Dixon
Thanks for the link to linq2sql profiler - a great tool!
Sergio
+2  A: 

As you're using SQL Server Express, which doesn't have Profiler, there is a free third-party profiler you can download here. I've used it when running SQL Express, and it will allow you to trace what's going on in the database.

Also, you can query the Dynamic Management Views (DMVs) to see which queries are the most costly, e.g. the TOP 10 queries that have used the most CPU time:

SELECT TOP 10 t.text, q.*, p.query_plan
FROM sys.dm_exec_query_stats q
    CROSS APPLY sys.dm_exec_sql_text(q.sql_handle) AS t
    CROSS APPLY sys.dm_exec_query_plan(q.plan_handle) AS p
ORDER BY q.total_worker_time DESC
AdaTheDev
+1 I would get SQL Server Developer Edition with a copy of the production database for things like this; then you do get Profiler. But if you need to profile live Express databases, that looks like a good solution.
SqlACID