
Say there is a database with 100+ tables, and a major feature is added that requires 20 existing tables to be modified and 30 more to be added. The changes were made over a long time (6 months) by multiple developers on the development database. Let's assume the changes do not make any existing production data invalid (e.g. added columns have default values or allow nulls, and there are no new relations or constraints that could not be fulfilled).

What is the easiest way to publish these changes in schema to the production database? Preferably, without shutting the database down for an extended amount of time.

+5  A: 

Write a T-SQL script that performs the needed changes. Test it on a copy of your production database (restore from a recent backup to get the copy). Fix the inevitable mistakes that the test will discover. Repeat until the script works perfectly.

Then, when it's time for the actual migration: lock the DB so only admins can log in. Take a backup. Run the script. Verify results. Put DB back online.

The longest part will be the backup, but you'd be crazy not to do it. You should know how long backups take; the overall process won't take much longer than that, so that's roughly how long your downtime will need to be. The middle of the night works well for most businesses.
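The migration window described above can be sketched in T-SQL. This is illustrative only; the database name and backup path are placeholders:

```sql
-- Sketch of the maintenance window; names and paths are placeholders.

-- 1. Lock out everyone but admins and dbo.
ALTER DATABASE MyAppDb SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;

-- 2. Take a full backup before touching the schema.
BACKUP DATABASE MyAppDb
TO DISK = N'D:\Backups\MyAppDb_pre_migration.bak'
WITH INIT, CHECKSUM;

-- 3. Run the tested migration script here
--    (e.g. :r migration.sql when using SQLCMD mode).

-- 4. Verify results, then put the database back online.
ALTER DATABASE MyAppDb SET MULTI_USER;
```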

Donnie
That means that the middle of the night, right after the normal backup, might be an ideal time to do the conversion. That way the conversion-backup is not in addition to the normal backup.
MJB
@MJB - As long as it's after a full backup and not just a tlog backup.
Donnie
And I agree with the other posts suggesting SQL compare to generate the script. This can potentially save you a lot of time, but still make sure you test the generated script at least once.
Donnie
+1: I agree, and add that these scripts should be created by the dev team, and exist in your source code repository.
OMG Ponies
+1  A: 

I've been using dbdeploy successfully for a couple of years now. It allows your devs to create small SQL change deltas which can then be applied against your database. The changes are tracked by a changelog table within the database, so dbdeploy knows what has already been applied.

http://dbdeploy.com/
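As a rough sketch of what a dbdeploy-style delta looks like (the file-number-gives-order convention and the `--//@UNDO` marker follow dbdeploy's approach; the table and column are made up for illustration):

```sql
-- File: 023_add_invoice_status.sql
-- The leading number determines apply order; dbdeploy records it in
-- its changelog table once the delta has been run.
ALTER TABLE Invoice ADD Status INT NULL;

--//@UNDO
ALTER TABLE Invoice DROP COLUMN Status;
```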

James O'Sullivan
+1 for dbdeploy. I use it too - simple but effective - and it's great to have your schema changes in source control.
Tom
+2  A: 

Use a tool to create a diff script and run it during a maintenance window. I use RedGate SQL Compare for this and have been very happy with it.

JohnFx
+1 for Red-Gate tools - excellent stuff!
marc_s
But use it carefully; there may be objects you do not want to move to prod in this version. You may also need to adjust the order of the change scripts.
HLGEM
+2  A: 

There is no generic answer on how to make 'changes' without downtime. The answer really depends from case to case, based on exactly what the changes are. Some changes have no impact on downtime (e.g. adding new tables), some have minimal impact (e.g. adding columns to existing tables with no data size change, like a new nullable column that does not increase the null bitmap size), and other changes will wreak havoc on downtime (any operation that changes data size will force an index rebuild and lock the table for the duration). Some changes are impossible to apply without *significant* downtime. I know of cases where the changes were applied in parallel: a copy of the database is created, replication is set up to keep it current, then the copy is changed and kept in sync, and finally operations are moved to the modified copy, which becomes the master database. There is a presentation at PASS 2009 given by Michelle Ufford that mentions how GoDaddy went through such a change, which lasted weeks.
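To illustrate the difference between a metadata-only change and a size-of-data change (table and column names are made up):

```sql
-- Usually a metadata-only change: existing rows are untouched,
-- so the operation completes almost instantly.
ALTER TABLE Customer ADD MiddleName NVARCHAR(50) NULL;

-- A size-of-data operation: every row must be rewritten, the table
-- is locked for the duration, and indexes containing the column
-- must be rebuilt.
ALTER TABLE Orders ALTER COLUMN Quantity BIGINT NOT NULL;
```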

But, at a lesser scale, you must apply the changes through a well-tested script, and measure the impact during testing.

But the real question is: is this the last change you will ever make to the schema? Have you finally discovered the perfect schema for the application, so the production database will never change again? Congratulations: once you pull this off, you can rest. But realistically, you will face the very same problem in 6 months. The real problem is your development process, with developers making changes from SSMS or from the VS Server Explorer straight into the database. Your development process must make a conscious effort to adopt a schema change strategy based on versioning and T-SQL scripts, like the one described in Version Control and your Database.
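One common shape for such versioned T-SQL scripts is to guard each change with a check against a schema version table (the `SchemaVersion` table, version numbers, and column here are illustrative assumptions, not from the linked article):

```sql
-- Apply change #42 only if the database is at the expected version,
-- so the script is safe to re-run and cannot skip a step.
IF EXISTS (SELECT * FROM dbo.SchemaVersion WHERE Version = 41)
BEGIN
    ALTER TABLE dbo.Customer ADD LoyaltyTier TINYINT NULL;
    UPDATE dbo.SchemaVersion SET Version = 42;
END
```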

Remus Rusanu
Thanks for the exhaustive answer - I agree that there needs to be a process to make sure that developers are not making changes ad hoc. Funnily enough, clicking on your last link displays "Error establishing a database connection".
Marek
'Error establishing a database connection': all I can say is that hostmonster.com hosting quality is well below par
Remus Rusanu