views:

317

answers:

5

How would one go about profiling a few queries that are being run from an ASP.NET application? There is some software where I work that runs extremely slow because of the database (I think). The tables have indexes but it still drags because it's working with so much data. How can I profile to see where I can make a few minor improvements that will hopefully lead to larger speed improvements?

Edit: I'd like to add that the webserver likes to timeout during these long queries.

+4  A: 

To profile SQL Server, use the SQL Profiler.

And you can use ANTS Profiler from Red Gate to profile your code.

Forgotten Semicolon
SQL Profiler usually does the trick for me!
mattruma
Is there a way to use it without being an administrator?
Joe Philllips
You can actually use the latest version of ANTS Profiler to track SQL and I/O usage as well.
Mel Harbour
+4  A: 

SQL Server has some excellent tools to help you with this situation. These tools are built into Management Studio (which used to be called Enterprise Manager plus Query Analyzer).

Use SQL Profiler to show you the actual queries coming from the web application.

Copy each of the problem queries out (the ones that eat up lots of CPU time or IO). Run the queries with "Display Actual Execution Plan". Hopefully you will spot an obvious missing index.
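As a sketch of that workflow (table and column names here are made up for illustration), paste the captured query into a Management Studio window, enable the actual execution plan, and look for table scans or missing-index hints:

```sql
-- Hypothetical slow query copied out of SQL Profiler.
-- In Management Studio, press Ctrl+M (or Query > Include Actual Execution Plan)
-- before running, then inspect the Execution Plan tab for scans.
-- SET STATISTICS PROFILE gives a text-based plan as an alternative.
SET STATISTICS PROFILE ON;

SELECT o.OrderID, o.OrderDate, c.CompanyName
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
WHERE o.OrderDate >= '2008-01-01';

SET STATISTICS PROFILE OFF;
```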

You can also run the tuning wizard (the button is right next to "Display Actual Execution Plan"). It will run the query and make suggestions.

Usually, if you already have indexes and queries are still running slow, you will need to re-write the queries in a different way.

Keeping all of your queries in stored procedures makes this job much easier.

Eric Z Beard
+1  A: 

I believe you have the answer you need to profile the queries. However, this is the easiest part of performance tuning. Once you know it is the queries and not the network or the app, how do you find and fix the problem?

Performance tuning is a complex thing, but there are some places to look first. You say you are returning lots of data: are you returning more data than you need? Are you returning only the columns and records you actually use? Returning 100 columns via SELECT * can be much slower than returning the 5 columns you are actually using.
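A minimal before/after sketch of that point (dbo.Orders and the column names are placeholders): both statements return the same rows, but the second reads far less data and can potentially be satisfied entirely from a narrow covering index.

```sql
-- Wide: drags every column across the wire, forces a lookup into the full row.
SELECT * FROM dbo.Orders WHERE CustomerID = 'ALFKI';

-- Narrow: only the columns the application actually uses.
SELECT OrderID, OrderDate, ShippedDate, Freight
FROM dbo.Orders
WHERE CustomerID = 'ALFKI';
```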

Are your indexes and statistics up to date? Look up how to update statistics and re-index in BOL if you haven't done this in a while. Do you have indexes on all the join fields? How about the fields in the WHERE clause?
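A maintenance sketch, with placeholder table and index names; run these during a quiet period, since rebuilds can be heavy:

```sql
-- Refresh the optimizer's statistics for one table.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- SQL Server 2005 and later: rebuild a fragmented index.
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD;

-- Older syntax that also still works:
-- DBCC DBREINDEX ('dbo.Orders');
```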

Have you used a cursor? Have you used subqueries? If you are using UNION, can it be changed to UNION ALL?
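On the UNION point, a quick illustration (table names are hypothetical): plain UNION must sort and de-duplicate the combined result, which UNION ALL skips entirely, so prefer UNION ALL whenever duplicates are impossible or acceptable.

```sql
-- The two sources cannot overlap, so the de-duplication step of UNION
-- would be pure overhead; UNION ALL just concatenates the results.
SELECT OrderID, CustomerID FROM dbo.Orders
UNION ALL
SELECT OrderID, CustomerID FROM dbo.ArchivedOrders;
```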

Are your queries sargable? (Google the term if you are unfamiliar with it.)
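To make the term concrete (again with made-up names): wrapping an indexed column in a function forces a scan, while an equivalent range predicate lets the optimizer seek on the index.

```sql
-- Non-sargable: the function on OrderDate defeats any index on that column.
SELECT OrderID FROM dbo.Orders
WHERE YEAR(OrderDate) = 2008;

-- Sargable rewrite: same rows, but an index on OrderDate can be range-seeked.
SELECT OrderID FROM dbo.Orders
WHERE OrderDate >= '2008-01-01' AND OrderDate < '2009-01-01';
```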

Are you using distinct when you could use group by?

Are you getting locks?
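Two quick ways to check for blocking, sketched below; the first works on any version of SQL Server, the second needs SQL Server 2005 or later:

```sql
-- Look at the BlkBy column to see which session is blocking which.
EXEC sp_who2;

-- SQL Server 2005+: lock requests that are currently waiting.
SELECT request_session_id, resource_type, request_mode, request_status
FROM sys.dm_tran_locks
WHERE request_status = 'WAIT';
```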

There are many other things to look at; these are just a starting place.

HLGEM
Good, concise suggestions.
Joe Philllips
+3  A: 

Another .NET profiler which plays nicely with ASP.NET is dotTrace. I have personally used it and found lots of bottlenecks in my code.

korchev
+1  A: 

If there is a particular query or stored procedure I want to tune, I have found turning on statistics before the query to be very useful:

SET STATISTICS TIME ON
SET STATISTICS IO ON

When you turn on statistics in Query Analyzer, the statistics are shown in the Messages tab of the Results pane.

IO statistics have been particularly useful for me, because they let me know when I might need an index. If I see a high read count in the IO statistics, I try adding different indexes to the affected tables. As I try an index, I run the query again to see whether the read count has gone down. After a few iterations, I can usually find the best index(es) for the tables involved.
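A complete session might look like the sketch below (the table and predicate are placeholders); the "logical reads" figure per table in the Messages tab is the number to watch as you experiment with indexes:

```sql
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- The query under test; re-run this after each index change and compare
-- the logical reads reported in the Messages tab.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 'ALFKI';

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```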

Here are links to MSDN for these statistics commands:

SET STATISTICS TIME

SET STATISTICS IO

Tim