Hi, I've got a stored proc that after a period of weeks will start to run very slowly. It starts its life executing in a few seconds and ends up taking a couple of minutes to execute.

We've found that dropping and recreating the procedure results in it executing in a few seconds again.

The procedure itself contains a few inner joins and a couple of left outer joins - but nothing out of the ordinary.

Why should it be slowing down so dramatically and what should we be doing to prevent this issue happening in the first place?

Many thanks.

A: 

It sounds like the execution plan is out of whack. You don't have to drop and recreate the procedure, though. You should be able to call sp_recompile and have it recompile on its next execution; when it does, it will build a new execution plan. The other thing I would check is that the statistics on the various tables are kept current, and that they are up to date before you call sp_recompile.
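
A minimal sketch of that (the procedure name dbo.usp_GetOrders is hypothetical; substitute your own):

-- Refresh statistics first so the new plan is built from current data.
EXEC sp_updatestats;

-- Mark the procedure for recompilation; a fresh execution plan
-- is generated the next time it runs.
EXEC sp_recompile N'dbo.usp_GetOrders';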

Thomas
+1  A: 

SQL Server has something called 'parameter sniffing'. Basically, the first time you run your stored procedure, the execution plan is created based on the values you passed to it. Run it again with the same values and it's fast; run it with different values and it might become really slow, if those values are 'bad' for the execution plan that was initially generated.

To avoid parameter sniffing, you can declare one local variable per parameter in your stored procedure and assign the parameters to the local variables. Then, in the stored procedure body, use only the variables and never the parameters. That way the execution plan won't be based on the values you used on the first run.
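
A minimal sketch of that pattern (the procedure, parameter, and table names here are hypothetical, for illustration only):

CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerId INT
AS
BEGIN
    -- Copy the parameter into a local variable...
    DECLARE @LocalCustomerId INT;
    SET @LocalCustomerId = @CustomerId;

    -- ...and reference only the variable in the query. The optimizer
    -- cannot sniff a variable's runtime value, so it builds the plan
    -- from average density statistics rather than the first caller's
    -- specific parameter value.
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId;
END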

You can also find a lot of good articles on the topic if you google 'sql parameter sniffing'.

zespri
A: 

Further to the answers provided so far, your execution plan relies upon the statistics in the database. If these are not current, the execution plan may not be optimal. You can update the stats using:

EXEC sp_updatestats
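
If only a few tables are involved, you can also refresh statistics per table; a minimal sketch, assuming a hypothetical table named dbo.Orders:

-- FULLSCAN reads every row for the most accurate statistics,
-- at the cost of a longer-running update.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;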
ck