views: 298
answers: 5
Is there a way to re-compile or at least 'check compile' stored procedures en masse? Sometimes we'll make schema changes - add or drop a column, etc. - and do our best to identify the affected procs, only to be bitten by one we missed, which pukes the next time it runs. SQL Server 2005 or 2008.

+1  A: 

Just iterate through them, after getting the list from sysobjects, and run sp_recompile:

Here is a link showing a sample script:
http://database.ittoolbox.com/groups/technical-functional/sql-server-l/recompile-all-stored-procedures-2764478
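A minimal sketch of that approach (not the linked script; this just cursors over sys.procedures, which exists on 2005/2008, and marks each one for recompile):

DECLARE @proc sysname;

DECLARE proc_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
    FROM sys.procedures;

OPEN proc_cursor;
FETCH NEXT FROM proc_cursor INTO @proc;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_recompile @proc;  -- only marks the plan; the actual compile happens on next execution
    FETCH NEXT FROM proc_cursor INTO @proc;
END
CLOSE proc_cursor;
DEALLOCATE proc_cursor;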

marcc
It is easier to use "sp_recompile tablename", as it will mark for recompile all procedures that refer to the table. However, this will not work for the OP, as sp_recompile only marks the procedures for recompile. They are actually compiled the next time they are run, which is the same time the OP's changes are "puking" at run time.
KM
+1  A: 

You may be able to use DBCC FREEPROCCACHE

http://msdn.microsoft.com/en-us/library/ms174283.aspx
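For what it's worth, flushing the plan cache is a single statement, though it only forces fresh plans on the next execution and won't flag procedures the schema change has broken. It also affects every database on the instance:

DBCC FREEPROCCACHE;  -- drops all cached plans; everything compiles fresh on its next run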

Shiraz Bhaiji
+1  A: 

If you have problems with table changes breaking stored procedures, try sp_depends:

sp_depends [ @objname = ] '<object>'

<object> ::=
{
    [ database_name. [ schema_name ] . | schema_name. ]
        object_name
}

and identify them before they break. Use it this way:

EXECUTE sp_depends  YourChangedTableName

Also, you could use sp_recompile:

EXEC sp_recompile YourChangedTable

but that only marks the associated stored procedures to be recompiled the next time they run.

You could also use Management Studio or your source control to generate a concatenated CREATE script of all procedures into a single file and then run that.

KM
+1  A: 

I understand your question as 'when I make a schema change, I want to validate that all procedures still execute correctly against the new schema'. I.e. if you drop a column that is referenced in a SELECT in a procedure, you want it flagged as requiring changes. So, specifically, I do not understand your question as 'I want the procedure to recompile on next execution', since that job is taken care of for you by the engine, which will detect the metadata version change associated with any schema alteration and discard the existing cached execution plans.

My first observation is that what you describe in your question is usually the job of a TEST and you should have a QA step in your deployment process that validates the new 'build'. The best solution you could have is to implement a minimal set of unit tests that, at the very least, iterates through all your stored procedures and validates the execution of each for correctness, in a test deployment. That would pretty much eliminate all surprises, at least eliminate them where it hurts (in production, or at customer site).

Your next best option is to rely on your development tools to track these dependencies. Visual Studio 2008 Database Edition provides such functionality out of the box and will take care of validating any change you make to the schema.

And finally your last option is to do something similar to what KM suggested: automate an iteration through all the procedures depending on the modified object (and through all procedures depending on those dependents, and so on recursively). It won't suffice to mark the procedures for recompilation; what you really need is to run the ALTER PROCEDURE to trigger a parsing of its text and a validation against the schema (things are a bit different in T-SQL vs. your usual language compile/execute cycle; the 'compilation' per se occurs only when the procedure is actually executed). You can start by iterating through sys.sql_dependencies to find all dependencies of your altered object, and also find the 'module definition' of the dependents from sys.sql_modules:

with cte_dep as (
    select object_id
    from sys.sql_dependencies
    where referenced_major_id = object_id('<your altered object name>')
    union all
    select d.object_id
    from sys.sql_dependencies d
    join cte_dep r on d.referenced_major_id = r.object_id
    )
, cte_distinct as (
    select distinct object_id
    from cte_dep)
select object_name(c.object_id)
    , c.object_id
    , m.definition
from cte_distinct c
join sys.sql_modules m on c.object_id = m.object_id

You can then run through the dependent 'modules' and re-create them (i.e. drop them and run the code in the 'definition'), as sketched below. Note that a 'module' is more generic than a stored procedure and also covers views, triggers, functions, rules, defaults and replication filters. Encrypted 'modules' will not have the definition available, and to be absolutely correct you must also account for the various settings captured in sys.sql_modules (ANSI nulls, schema binding, execute as clauses, etc.).
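A rough sketch of that re-create loop, with all the caveats above: it assumes unencrypted modules with default options, it does not preserve permissions or extended properties on the dropped procedures, and for brevity it walks all stored procedures rather than only the dependent set returned by the query above. A reference to a dropped column on an existing table fails at create time, which is the point of the exercise:

DECLARE @name sysname, @definition nvarchar(max), @sql nvarchar(max);

DECLARE module_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name), m.definition
    FROM sys.sql_modules m
    JOIN sys.objects o ON o.object_id = m.object_id
    WHERE o.type = 'P'              -- restrict the sketch to stored procedures
      AND m.definition IS NOT NULL; -- skip encrypted modules

OPEN module_cursor;
FETCH NEXT FROM module_cursor INTO @name, @definition;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'DROP PROCEDURE ' + @name;
    EXEC (@sql);          -- drop the existing procedure
    EXEC (@definition);   -- re-run its CREATE PROCEDURE text; a broken column
                          -- reference against an existing table errors here
    FETCH NEXT FROM module_cursor INTO @name, @definition;
END
CLOSE module_cursor;
DEALLOCATE module_cursor;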

If you use dynamic SQL, that cannot be verified: it will not be captured by sys.sql_dependencies, nor will it be validated by 're-creating' the module.

Overall I think your best option, by a large margin, is to implement the unit tests validation.

Remus Rusanu
+1  A: 

I know what you mean, and in many scenarios I recognize your need. You might have a look at sp_refreshsqlmodule.
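A minimal sketch of applying it across the board: sp_refreshsqlmodule re-parses a non-schema-bound module against the current metadata, so it can surface some breakages without dropping anything, and the TRY/CATCH simply reports failures and keeps going.

DECLARE @module nvarchar(776);

DECLARE module_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
    FROM sys.procedures;

OPEN module_cursor;
FETCH NEXT FROM module_cursor INTO @module;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_refreshsqlmodule @name = @module;
    END TRY
    BEGIN CATCH
        PRINT @module + N': ' + ERROR_MESSAGE();  -- report the broken module, keep going
    END CATCH;
    FETCH NEXT FROM module_cursor INTO @module;
END
CLOSE module_cursor;
DEALLOCATE module_cursor;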

Good luck, Ron

This is excellent - thank you!
n8wrl