tags:
views: 237
answers: 5

Hi

This is probably the most classic database problem.

I have an E-commerce software solution hosted on a SQL Server for data and a web server for the frontend. Every instance/customer has its own database on SQL Server 2008.

During development of the next version, I might change or add tables, views, stored procedures etc.

How do I publish this change to all databases without losing data? It should be done via a script or something similar. Centralized management is the key...
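
Ideally I'd like to run something along these lines once from a central server (just a rough sketch - the 'Shop' database prefix and the ALTER statement are placeholders for my own naming and changes):

    DECLARE @db sysname, @sql nvarchar(max);

    DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT name
        FROM sys.databases
        WHERE name LIKE 'Shop%';   -- however the customer databases are identified

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @db;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Run the same change script in the context of each customer database.
        SET @sql = N'USE ' + QUOTENAME(@db) + N';
            IF COL_LENGTH(''dbo.Orders'', ''TrackingNumber'') IS NULL
                ALTER TABLE dbo.Orders ADD TrackingNumber nvarchar(50) NULL;';
        EXEC (@sql);

        FETCH NEXT FROM db_cursor INTO @db;
    END

    CLOSE db_cursor;
    DEALLOCATE db_cursor;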

A: 

A script should be executed on the SQL Server machine by the DB admin.

The main algorithm of such a script: create a backup, then for each table in a loop, lock it, alter it, and release the lock.
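
Roughly something like this per database (a sketch only - the database name, backup path, and ALTER statement are placeholders):

    BACKUP DATABASE CustomerShop
        TO DISK = N'D:\Backups\CustomerShop_pre_upgrade.bak'
        WITH INIT;

    BEGIN TRANSACTION;
        -- ALTER TABLE takes a schema-modification lock on the table;
        -- wrapping it in a transaction holds that lock until COMMIT.
        ALTER TABLE dbo.Orders
            ADD TrackingNumber nvarchar(50) NULL;
    COMMIT TRANSACTION;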

abatishchev
+2  A: 

Perhaps it's something you've already considered, but my company uses software from Red Gate (http://www.red-gate.com/) which compares our development version of the DB with the production one, then generates and executes scripts to bring production in line with development.

(I'm not a sales person from Red Gate, but I think this might be what you're looking for)

Tuxified
We use this software as well, and it is excellent at its job. Again, not a Red Gate sales person, just a very happy customer!
Jay S
+2  A: 

I use SQL Compare for schema changes and SQL Data Compare for data changes. Works like a charm!

Conrad
A: 

This problem is essentially one of automating the manual process of logging on to a SQL Server and running a script against one or more databases to make the modifications you need.

It's made worse, of course, if the instances of SQL Server that you need to update are remote from you, and therefore not directly accessible.

It's also vital to ensure that the scripts are applied in sequence - there would be no point running the "add index" script before the "create table" script.

The way we've solved this is with a web service that packages script files as datasets, and delivers them in the correct sequence to the remote systems when they call home.

On the remote SQL Server, we have a .NET application which calls the web service, downloads the script files, unpacks them and applies them to the database.

When the remote system calls in, it supplies the ID of the most recent upgrade it has. When the web service completes, it knows the last one it delivered. It's therefore trivial to know what level the remote systems are at.
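
The tracking side is very simple; roughly something like this lives in each remote database (a simplified sketch - the table and column names here are illustrative, not our exact schema):

    IF OBJECT_ID(N'dbo.SchemaUpgrade', N'U') IS NULL
        CREATE TABLE dbo.SchemaUpgrade
        (
            UpgradeId   int            NOT NULL PRIMARY KEY,  -- the ID reported back to the web service
            AppliedAt   datetime       NOT NULL DEFAULT GETDATE(),
            Description nvarchar(200)  NULL
        );
    GO

    -- The local updater reports the highest UpgradeId it has applied; the central
    -- server then returns every script with a higher ID, in order.
    SELECT ISNULL(MAX(UpgradeId), 0) AS CurrentUpgradeId
    FROM dbo.SchemaUpgrade;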

The only manual intervention required is to create the scripts in the first place, and upload them to the central server.

ChrisA
How do you generate the change scripts that you upload to the central server?
MartinHN
Depends. SSMS will generate scripts for most structural changes. These get saved and stored in a table, which ends up in a dataset when packed by the web service. Other scripts, usually ones that manipulate data (usually meta data), are simply written manually and shipped the same way.
ChrisA
Do you know which table these SSMS generated scripts are saved to?
MartinHN
They're not saved to tables automatically... we manually save the scripts as files, then programmatically read them in and save the memory stream in a table. It's then easy to extract it to a dataset, and ship that via the web service.
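The holding table itself is nothing special - something along these lines (column names are illustrative):

    CREATE TABLE dbo.UpgradeScript
    (
        UpgradeId   int             NOT NULL PRIMARY KEY,   -- the upgrade sequence number
        FileName    nvarchar(260)   NOT NULL,
        ScriptBody  varbinary(max)  NOT NULL,                -- the file contents, saved from the memory stream
        UploadedAt  datetime        NOT NULL DEFAULT GETDATE()
    );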
ChrisA
A: 

Another poster mentioned the Red Gate products, and I'll throw another commercial product out there - Quest Change Director:

http://www.quest.com/change-director-for-sql-server/

Disclaimer: I work for Quest, although I'm not in sales. Change Director does comparisons, syncing, links to a change management system, can use your dev/qa server as a source or use T-SQL scripts, has an audit trail and rollback capabilities, etc.

Like you said, central management is key, and this product focuses on that.

Brent Ozar