I need to restore a backup of a production database and then automatically reapply SQL scripts (ALTER TABLE, INSERT, and so on) to bring that database's schema back to what was under development.

There will be lots of scripts, from a handful of different developers. They won't all be in the same directory.

My current plan is to list the scripts, with their full filesystem paths, in a table in a pseudo-system database, then create a stored procedure in that database which first runs RESTORE DATABASE and then runs a cursor over the list of scripts, building a SQLCMD command string for each script and executing it via xp_cmdshell.
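In rough outline, the procedure would look something like this (a sketch only; the AdminDb database, ScriptList table, and all paths are placeholders I made up for illustration):

    -- Sketch only: database names, table names, and paths are placeholders.
    RESTORE DATABASE DevDb FROM DISK = N'C:\Backups\ProdDb.bak' WITH REPLACE;

    DECLARE @path nvarchar(260), @cmd nvarchar(512);

    DECLARE script_cursor CURSOR FOR
        SELECT ScriptPath FROM AdminDb.dbo.ScriptList ORDER BY RunOrder;

    OPEN script_cursor;
    FETCH NEXT FROM script_cursor INTO @path;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Build and shell out a SQLCMD call for each script.
        SET @cmd = N'sqlcmd -S . -d DevDb -E -b -i "' + @path + N'"';
        EXEC master..xp_cmdshell @cmd;  -- requires xp_cmdshell to be enabled
        FETCH NEXT FROM script_cursor INTO @path;
    END;

    CLOSE script_cursor;
    DEALLOCATE script_cursor;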

The sequence of cursor->sqlstring->xp_cmdshell->sqlcmd feels clumsy to me. Also, it requires turning on xp_cmdshell.

I can't be the only one who has done something like this. Is there a cleaner way to run a set of scripts that are scattered around the server's filesystem? In particular, is there a way that doesn't require xp_cmdshell?

+1  A: 

First off, and A-number-one: collect all the database scripts in one central location. Some form of source control or version control is best, as you can then see who modified what, when, and (using diff tools if nothing else) why. Leaving the code used to create your databases scattered hither and yon about your network is a recipe for disaster.

Second off, you need to run scripts against your database. That means you need someone or something to run them, which means executing code. If you're performing this code execution from within SQL Server, you pretty much are going to end up using xp_cmdshell. The alternative? Use something else that can run scripts against databases.

My current solution to this kind of problem is to store the scripts in text (.sql) files, store the files in source control, and keep careful track of the order in which they are to be executed (for example, CREATE TABLEs get run before ALTER TABLEs that add subsequent columns). I then have a batch file--yeah, I've been around for a while, you could do this in most any language--to call SQLCMD (we're on SQL 2005, I used to use osql) and run these scripts against the necessary database(s).
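Such a batch file can stay very small. Here is a sketch of what I mean (the server name, database name, and the script_order.txt list file are examples, not anything you'd have already; sqlcmd's -b flag makes a failing script abort the run):

    @echo off
    rem Sketch only: server, database, and the list file are examples.
    rem script_order.txt lists one .sql path per line, in dependency order.
    for /f "usebackq delims=" %%f in ("script_order.txt") do (
        sqlcmd -S MYSERVER -d DevDb -E -b -i "%%f" || goto :fail
    )
    echo All scripts applied.
    goto :eof

    :fail
    echo Script failed; stopping.
    exit /b 1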

If you don't want to try and "roll your own", there may be more formal tools out there to help manage this process.

Philip Kelley
I agree about version control, but that's another story and not really my decision. ["If you're performing this code execution from within SQL Server, you pretty much are going to end up using xp_cmdshell."] I can do that and don't really mind; I just didn't want to miss something cleaner or cleverer that I didn't know about.
A: 

Beyond the suggestions about centralization and source control made by Philip Kelley: if you are familiar with .NET, you might consider writing a small WinForms or WebForms app that uses SQL Server SMO (SQL Server Management Objects). With it, you can pass an entire script to the database just as if you had dropped it into Management Studio, which avoids the need for xp_cmdshell and sqlcmd. Another option would be to create a DTS/SSIS package that reads the files and runs an Execute SQL Task in a loop.
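To give a flavor of the SMO route, here is a minimal console sketch rather than a full WinForms app (the server name, database name, and scripts.txt list file are assumptions for illustration). The payoff is that ServerConnection.ExecuteNonQuery understands GO batch separators, which a plain SqlCommand does not:

    using System;
    using System.IO;
    using Microsoft.SqlServer.Management.Common;
    using Microsoft.SqlServer.Management.Smo;

    class ScriptRunner
    {
        static void Main()
        {
            // Server instance and database name are assumptions for the sketch.
            var conn = new ServerConnection("MYSERVER");
            conn.DatabaseName = "DevDb";
            var server = new Server(conn);

            // scripts.txt (hypothetical) lists one .sql path per line, in run order.
            foreach (string path in File.ReadAllLines("scripts.txt"))
            {
                string script = File.ReadAllText(path);
                // ExecuteNonQuery splits the script on GO separators for us.
                server.ConnectionContext.ExecuteNonQuery(script);
                Console.WriteLine("Applied " + path);
            }
        }
    }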

Thomas
The idea of a little WinForms app had occurred to me also. I don't know SSIS and wasn't sure whether it could read files, so it's good to know that it can. Thanks for the reply; sorry I can't give you a vote up yet.