If you have a lot of Stored Procedures and you change the name of a column of a table, is there a way to check which Stored Procedures won't work any longer?


Update: I've read some of the answers, and it's clear to me that there's no easy way to do this. Would it be easier to move away from Stored Procedures?

A: 

I'm sure there are more elegant ways to address this, but if the database isn't too complex, here's a quick and dirty way:

Select all the sprocs and script to a query window.

Search for the old column name.

HectorMac
+3  A: 

There's a book-style answer to this, and a real-world answer.

First, the book answer: you can use sp_depends to see which other stored procs reference the table (it won't drill down to the individual column), and then examine those procs to see whether they use the column:

http://msdn.microsoft.com/en-us/library/ms189487.aspx
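For example (dbo.Customers is a hypothetical table name):

```sql
-- Lists objects that reference the table; it won't tell you which columns they use
EXEC sp_depends @objname = N'dbo.Customers';
```

Note that sp_depends is deprecated in later versions; sys.sql_expression_dependencies is the documented replacement there.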

The real-world answer, though, is that it doesn't work in a lot of cases:

  • Dynamic SQL strings: if you're building strings dynamically, either in a stored proc or in your application code, and then executing that string, SQL Server has no way of knowing what your code is doing. You may have the column name hard-coded in your code, and that'll break.
  • Embedded T-SQL code: if you've got code in your application (not in SQL Server) then nothing in the SQL Server side will detect it.

Another option is to use SQL Server Profiler to capture a trace of all activity on the server, then search the captured queries for the field name you want. It's not a good idea on a production server, because the trace incurs some overhead, but it does work - most of the time. Where it breaks down is if your application does a "SELECT *" and then expects a specific field name to come back as part of that result set.

You're probably beginning to get the picture that there's no simple, straightforward way to do this.

Brent Ozar
A: 

If you are only interested in finding the column usage in stored procedures, probably the best way is to do a brute-force search for the column name in the definition column of the sys.sql_modules table, which stores the definitions of stored procedures and functions.
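Something along these lines (OldColumnName is a placeholder):

```sql
-- Find modules whose definition mentions the column name
SELECT OBJECT_NAME(m.object_id) AS object_name
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%OldColumnName%';
```

Since this is a plain text match, it will also flag commented-out occurrences and identically named columns in other tables.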

no_one
+4  A: 

I'm a big fan of SysComments for this:

SELECT DISTINCT Object_Name(ID) 
FROM SysComments 
WHERE text LIKE '%Table%'
AND text LIKE '%Column%'
Meff
Well, looks like I've just learnt a new trick. Thanks!
Sören Kuklau
In 2005+ the recommended way is to use the OBJECT_DEFINITION function, since syscomments is deprecated
Mladen Prajdic
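For reference, an OBJECT_DEFINITION version of the same search (Table/Column are placeholders, as above):

```sql
-- 2005+ alternative to syscomments
SELECT name
FROM sys.objects
WHERE type IN ('P', 'FN', 'IF', 'TF', 'V')
  AND OBJECT_DEFINITION(object_id) LIKE '%Column%';
```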
+2  A: 

While this will take the most work, the best way to ensure that everything works is to write integration tests.

Integration tests are just like unit tests, except in this case they integrate with the database. It would take some effort, but you could write tests that exercise each stored procedure to ensure it executes without error.

In the simplest case, a test would just execute the sproc and make sure there is no error, without worrying about the actual results. If your tests just execute sprocs without checking results, you can write a lot of this generically.
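A rough sketch of that generic version in T-SQL - it assumes the procedures take no parameters, and since it actually executes them, it should only ever be run against a disposable test database:

```sql
-- Smoke test: execute every stored procedure and report the ones that error out
DECLARE @name sysname, @sql nvarchar(max);

DECLARE proc_cursor CURSOR FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
    FROM sys.procedures;

OPEN proc_cursor;
FETCH NEXT FROM proc_cursor INTO @name;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        SET @sql = N'EXEC ' + @name;
        EXEC sp_executesql @sql;
    END TRY
    BEGIN CATCH
        PRINT @name + N' failed: ' + ERROR_MESSAGE();
    END CATCH;
    FETCH NEXT FROM proc_cursor INTO @name;
END;

CLOSE proc_cursor;
DEALLOCATE proc_cursor;
```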

To do this you would need a database to execute against. While you could set up the database and deploy your stored procs manually, the best way would be to use continuous integration to automatically get the latest code (database DDL, stored procs, tests) from your source control system, build your database, and execute your tests. This would happen every time you committed changes to source control.

Yes, it's a lot of work, but the payoff is also big: the ability to ensure that your changes don't break anything lets you move your product forward faster and with better quality.

Take a look at NUnit and NDbUnit.

Todd