I have a stored procedure that executes a complicated fetch and frequently times out when used. The proposed solution in my department has been to simply increase the timeout value, which I don't really want to do. I'd like to refactor this sproc, but because it's so complicated and undocumented (yay, legacy systems) I'm concerned my refactoring may not preserve the same functionality while making it run more efficiently. Are there any strategies to use when refactoring a stored procedure to ensure the same work is being performed in less time?

This is a Microsoft SQL Server 2005 stored procedure.

+3  A: 

Generally, timeouts occur on a single SQL statement. Break the proc up into separate statements, possibly by making effective use of temp tables, so you are not trying to do too much in a single chunk. By doing this you can also home in on your performance bottleneck and possibly identify some useful indexes if needed.
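
As a rough sketch of that approach (the table names, columns, and the @StartDate variable are made up for illustration, not taken from the original proc):

    -- Before: one large statement joining everything at once (prone to timing out)

    -- After: stage intermediate results in temp tables, one step at a time
    DECLARE @StartDate datetime;
    SET @StartDate = '20080101';            -- stand-in for a proc parameter

    SELECT OrderID, CustomerID, OrderDate
    INTO #RecentOrders
    FROM Orders
    WHERE OrderDate >= @StartDate;          -- narrow the data set early

    SELECT ro.OrderID, SUM(ol.Quantity * ol.UnitPrice) AS OrderTotal
    INTO #OrderTotals
    FROM #RecentOrders ro
    JOIN OrderLines ol ON ol.OrderID = ro.OrderID
    GROUP BY ro.OrderID;

    -- Final result assembled from the smaller intermediate sets
    SELECT c.CustomerName, ro.OrderDate, ot.OrderTotal
    FROM #RecentOrders ro
    JOIN #OrderTotals ot ON ot.OrderID = ro.OrderID
    JOIN Customers c ON c.CustomerID = ro.CustomerID;

Each step can then be timed and tuned on its own.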

Ichorus
+3  A: 

The most common cause of inefficient stored procedures I've encountered is a prevalence of scalar-type operations as opposed to set-based operations. Most RDBMSs (Oracle, SQL Server, MySQL, etc.) are much more efficient at performing work on large sets of data than at repeating a single operation many times. It is more efficient to perform one operation on a million rows than to perform the same operation a million times, once per row.
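
To illustrate the difference, a hedged sketch with made-up table and column names: the first version applies a discount one row at a time via a cursor, the second does the same work as a single set-based statement.

    -- Scalar / row-by-row: a cursor touching one row per iteration
    DECLARE @OrderID int;
    DECLARE order_cursor CURSOR FOR
        SELECT OrderID FROM Orders WHERE OrderDate < '20080101';
    OPEN order_cursor;
    FETCH NEXT FROM order_cursor INTO @OrderID;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        UPDATE Orders SET Discount = 0.10 WHERE OrderID = @OrderID;
        FETCH NEXT FROM order_cursor INTO @OrderID;
    END
    CLOSE order_cursor;
    DEALLOCATE order_cursor;

    -- Set-based: one statement over the whole set
    UPDATE Orders
    SET Discount = 0.10
    WHERE OrderDate < '20080101';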

After identifying these types of bottlenecks (look at function calls first), I would suggest taking a look at the indexing strategy on the table(s) you're referencing. Depending on your RDBMS of choice, you might have some wizard-type functionality that can suggest a proper index structure based on a sample workload.
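
In SQL Server 2005 that wizard is the Database Engine Tuning Advisor; as a manual sketch (hypothetical table and columns), a covering index for a query that filters on CustomerID and OrderDate and returns OrderTotal might look like:

    -- Covers the filter columns and includes the returned column,
    -- so the query can be satisfied from the index alone
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
    ON Orders (CustomerID, OrderDate)
    INCLUDE (OrderTotal);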

What database are you using? That might help to fine-tune some of my suggestions.

Gunny
I added the server specification to the question itself. Thanks for responding.
ddc0660
+4  A: 

I have been faced with this situation in the past. The best thing to do is create a simple C# or VB.NET application. When you refactor the sp, give it a new name. Use the application to call both the old and new sp's, then compare their output to ensure they return exactly the same values in the same order.

You would want to test as wide a variety of input parameters as you can to ensure your refactoring hasn't modified the business logic.
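
If a separate .NET harness is more than you need, the same comparison can be sketched directly in T-SQL, assuming both procedures take the same parameters and return a single result set with matching columns (procedure names, parameters, and columns here are hypothetical):

    -- Capture each procedure's output
    CREATE TABLE #OldResult (OrderID int, CustomerID int, OrderTotal money);
    CREATE TABLE #NewResult (OrderID int, CustomerID int, OrderTotal money);

    INSERT INTO #OldResult EXEC dbo.GetOrderSummary @CustomerID = 42;
    INSERT INTO #NewResult EXEC dbo.GetOrderSummary_Refactored @CustomerID = 42;

    -- Any rows returned here indicate the two versions disagree
    SELECT * FROM #OldResult EXCEPT SELECT * FROM #NewResult;
    SELECT * FROM #NewResult EXCEPT SELECT * FROM #OldResult;
    -- Note: EXCEPT ignores row order and duplicate counts; check those separately if they matter.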

Also, using NUnit can help simplify this task.

When I started my current position, I was given a database that had to be modified for a new schema. It required changing over 100 sp's. Using the application I've described, I was able to be reasonably sure that none of my modifications broke the business rules.

You're right, just increasing the timeout is the wrong first answer. Improve the sp as best you can, then increase the timeout if necessary.

Dave_H
+2  A: 

Use the SQL Server Profiler to study how the current SP runs; it will highlight inefficiencies and allow you to target just those specific areas to begin with, whilst leaving the more performant bits alone. You can then use the profiler again on your revised SP to compare performance.
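
Alongside Profiler, a quick way to compare the two versions from a query window is the built-in statistics output (the procedure names and parameter are hypothetical):

    SET STATISTICS IO ON;      -- logical/physical reads per table
    SET STATISTICS TIME ON;    -- CPU and elapsed time per statement

    EXEC dbo.GetOrderSummary @CustomerID = 42;             -- original
    EXEC dbo.GetOrderSummary_Refactored @CustomerID = 42;  -- refactored

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;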

I would echo Gunny's recommendation to take a good look at function calls - in set-based operations these can have a real impact on performance. I've achieved massive performance gains in the past just by stripping out a single UDF and replicating its logic inline.
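
As a rough illustration of that kind of change (the UDF, tables, and columns are made up), replacing a per-row scalar function call with the equivalent inline, set-based logic:

    -- Before: scalar UDF evaluated once per row
    SELECT o.OrderID, dbo.fn_OrderTotal(o.OrderID) AS OrderTotal
    FROM Orders o;

    -- After: the same calculation expressed as a join and aggregate
    SELECT o.OrderID, SUM(ol.Quantity * ol.UnitPrice) AS OrderTotal
    FROM Orders o
    LEFT JOIN OrderLines ol ON ol.OrderID = o.OrderID
    GROUP BY o.OrderID;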

Luke Bennett