Our current scenario is like this:

  • we have an existing database that needs to be updated for each new release that we install
  • we have to do this from individual SQL scripts (we can't use DB compare/diff tools)
  • the installation should run the SQL scripts as automatically as possible
  • the installation should only run those SQL scripts that haven't been run before
  • the installation should dump a report of which scripts ran and which did not
  • installation is done by the customer's IT staff, who are not terribly SQL-savvy

Whether this is a good setup or not is beyond this question - right now, take it as an absolute given - we can't change it this year or next, for sure.

Right now, we're using a homegrown "DB Update Manager" to do this - it works, most of the time, but the amount of work needed to make it truly automatic seems excessive.

Yes, I know about SQLCMD - but that seems a bit "too basic" - or not?

Does anyone out there do the same thing? If so - how? Are you using some tool and if so - which one?

Thanks for any ideas, input, thoughts, pointers to tools or methods you use!

Marc

+1  A: 

I have a similar setup to this and this is my solution:

  • Have a dbVersion table that stores a version number and a datetime stamp.
  • Have a folder where scripts are stored with a numbering system, e.g. x[000].
  • Have a console / GUI app that runs as part of the installation and compares the dbVersion number with the numbers of the files.
  • Run each new file in order, each in a transaction.

This has worked for us for quite a while.
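The core of that loop can be sketched in shell. This is only an illustration, not the answerer's actual app: the `scripts` directory layout and the `dbversion` file standing in for the dbVersion table are assumptions, and the `echo "applying ..."` line is where a real tool would invoke sqlcmd inside a transaction.

```shell
#!/bin/sh
# Sketch: run each numbered script whose number is greater than the
# stored version, in order, then bump the stored version.
# "scripts" dir and "dbversion" file are hypothetical stand-ins.
apply_migrations() {
  dir=$1; versionfile=$2
  current=$(cat "$versionfile" 2>/dev/null || echo 0)
  for f in $(ls "$dir"/*.sql 2>/dev/null | sort); do
    # derive the script number from a name like 002_add_column.sql
    num=$(basename "$f" | cut -d_ -f1 | sed 's/^0*//')
    [ -z "$num" ] && num=0
    if [ "$num" -gt "$current" ]; then
      echo "applying $f"          # real version: sqlcmd -i "$f", in a transaction
      echo "$num" > "$versionfile"
      current=$num
    else
      echo "skipping $f"          # already applied in an earlier release
    fi
  done
}
```

Logging both the "applying" and "skipping" lines to a file would also cover the reporting requirement from the question.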

Part of our GUI app allows the user to choose which database to update, then a certain string #dbname# in the script is replaced by the database name they choose.
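That token replacement could be done with a one-line sed filter; the `#dbname#` token comes from the answer, but the function and file names here are illustrative, not the answerer's actual code.

```shell
#!/bin/sh
# Sketch: substitute the #dbname# token in a script with the chosen
# database name before execution. Assumes the name contains no sed
# metacharacters.
render_script() {
  dbname=$1; infile=$2
  sed "s/#dbname#/$dbname/g" "$infile"
}
```

The rendered output could then be piped into sqlcmd, e.g. `render_script MyDb update.sql | sqlcmd -S server -E`.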

ck
That's pretty much our setup right now, yes - the cost and effort of upkeep on our custom DB Update Manager just seems outrageous - any off-the-shelf tool would be preferred. Thanks!
marc_s
+2  A: 

I have a similar situation. We maintain the database object scripts in version control. For a release, the appropriate versions are tagged and pulled from version control. A custom script concatenates the individual object scripts into a set of Create_DB, Create_DB_Tables, Create_DB_Procs, ... At a prior job I used manually crafted batch files and OSQL to run the database create/update scripts.
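The concatenation step can be as simple as a loop over `cat`; this is a generic sketch, not the answerer's custom script, and the output names and `GO` batch separator (T-SQL convention) are assumptions.

```shell
#!/bin/sh
# Sketch: concatenate individual object scripts, in a fixed order,
# into one deployable script. File names are illustrative only.
build_script() {
  outfile=$1; shift
  : > "$outfile"                    # truncate/create the combined script
  for f in "$@"; do
    cat "$f" >> "$outfile"
    printf '\nGO\n' >> "$outfile"   # batch separator between objects
  done
}
```

A build step might then call it once per target, e.g. `build_script Create_DB_Tables.sql tables/*.sql`.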

In my current position we have an InstallShield setup with a custom "Install Helper" written in C++ to invoke the database scripts using SqlCmd.

Also, like CK, we have a SchemaVersion table in each database. The table contains both app and database version information. The schema version is just an integer that gets incremented with each release.

Sounds complicated but it works pretty well.

+1  A: 

You might try Wizardby: it allows you to specify database changes incrementally, and it will apply those changes in a very controlled manner. You'll have to write an MDL file and distribute it with your application along with the Wizardby binaries; during installation, the setup will check whether the database version is up to date, and if not, it will apply all necessary changes in a transaction.

Internally it maintains a SchemaInfo table, which tracks which migrations (versions) were applied to a particular instance of the database, so it can reliably run only required ones.

Anton Gogolev
Looks very interesting, downside is: Yet Another separate language to learn...
marc_s
True, but I guess some kind of fluent API can be added very easily so that migrations will be expressed in C# rather than in MDL.
Anton Gogolev
A: 

If you maintain your changes in source control (e.g. SVN) you could use a batch file with SQLCMD to deploy only the latest changes from a particular SVN branch.

For example,

rem Code Changes
sqlcmd -U username -P password -S %1 -d DatabaseName -i "..\relativePath\dbo.ObjectName.sql"

Say you maintain an SVN branch specifically for deployment. You would commit your code to that branch, then execute a batch file which would deploy the desired objects.

Drawbacks include the inability to check on the fly for table-related changes. For example, if one of your SQL scripts adds a column, rerunning that script won't be smart enough to see that the column was already added in a previous run. To mitigate that, you could build an app that generates the batch file for you, with some logic to interact with the destination database, check which changes have or haven't already been applied, and act accordingly.

Darth Continent