My company provides a large .NET service-oriented solution. The services layer interacts with a T-SQL back end consisting of hundreds of tables and stored procedures. Our C# code is in version control (SVN) but our stored procedures and schema are not.

After much lobbying of expediency-minded upper management, I was allowed to review our (non-existent) build/deployment process to accomplish the following goals:

  1. Place schema and stored procedures under source-control.
  2. Automate the build/deployment process.

I would like to proceed per the accepted answer's strategy in this post but have additional questions:

  1. I would like to use Hudson as my build server. Is this a reasonable choice for a C#/SQL solution? What better alternatives should I explore?

  2. Assuming I have all triggers, stored procedures, schema, etc. under source control, and that they are scripted to individual files, how do I generate a build script that takes the dependencies/references between these items into account? (SQL Server does this automatically, but it generates one giant script.)

  3. What does the workflow for performing an update at a client site look like? I.e., I have to keep existing table data. How do I roll back schema changes?

  4. I am the only programmer. Several other pseudo-technical staff like to make changes directly inside SQL Management Studio. Is it realistic to expect others to adhere to this solution -- how can I enforce this?

Thank you in advance for your help.

Edit:

Unfortunately we will not be able to use TFS. We do have Visual Studio 2008/2010 with the Database Project components available, though, so it looks like I'll have to hack together a script-based solution. Any suggestions/updates are appreciated.

+2  A: 

The canonical example on the Microsoft stack for T-SQL deployment is the Visual Studio Database Project deployment process. In this process, your database schema, procedures, rights assignments, and pretty much everything else are stored as pieces of a VSDB project, which means they are stored as SQL definition files and checked into source control (SVN is fine). The build process delivers a .dbschema file, which contains a synthesis of the entire VSDB project (it is a glorified XML file). The .dbschema file is then shipped to the deployment server (development server, QA validation server, even production server) and deployed. Deployment is done via the vsdbcmd tool, which runs a sophisticated diff between the deployment server and the .dbschema file and 'aligns' the server with the content of the .dbschema file, using CREATE/ALTER/DROP statements as appropriate, based on what exists in the target database/server.
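The vsdbcmd step described above boils down to a single command line. The following is a sketch only: the file, server, and database names are placeholders, and the exact switches should be verified against the version of the tool actually installed:

```shell
# Sketch: deploy a compiled .dbschema to a target server with vsdbcmd.
# MyService.dbschema, QA-SQL01, and MyServiceDb are invented names.
vsdbcmd /a:Deploy /dd:+ /dsp:Sql /model:MyService.dbschema /cs:"Data Source=QA-SQL01;Integrated Security=True" /p:TargetDatabase=MyServiceDb
```

With /dd:- instead of /dd:+, the tool only generates the diff script rather than applying it, which is useful for reviewing what a deployment would do.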

A continuously integrated process would start a nightly build, drop the .dbschema along with the other deliverables on the test SQL Server, deploy the .dbschema, then run build validation tests, and, if all goes well, in the end produce a fully built and QA-validated deliverable, the daily 'drop'. Full integration all the way to deployment into production is possible, but usually avoided due to the risk of unexpected downtime on the central production server. However, full integration and deployment into production is usually the norm for multi-server environments, where 'production' means hundreds/thousands of deployed servers.

Now you say that you want to deploy using Hudson, which is all good, except that you have to recreate everything I describe in the steps above as Ant build steps, and you'll spend the next 10 years reinventing the VS DB project concepts, like the .dbschema file and tools like vsdbcmd. I'm not the one who can make the call to invest in buying a VSDB and TFS based build server license, but I'm saying that I'm not aware of an end-to-end solution available in OSS. With VS 2010, the Database Projects are in the Standard Edition, I believe. With VS 2008 you'd need the high-end license.

As for users riding shotgun and making changes from SSMS: you can prevent them using DDL triggers, you can track them using Event Notifications, or you can fully audit them using C2-compliant auditing.
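A minimal sketch of the DDL-trigger approach, assuming a dedicated deployment login; the trigger name, login name, and event list below are all invented for illustration:

```sql
-- Sketch only: block ad-hoc schema changes from SSMS at the database level.
-- 'deploy_svc' is a hypothetical deployment account; add or remove DDL
-- events from the FOR list to match what you want to lock down.
CREATE TRIGGER ddl_block_adhoc_changes
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
    CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
BEGIN
    IF ORIGINAL_LOGIN() <> N'deploy_svc'
    BEGIN
        RAISERROR (N'Schema changes must go through the deployment process.', 16, 1);
        ROLLBACK;  -- undoes the DDL statement that fired the trigger
    END
END;
```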

Remus Rusanu
"As of users doing changes riding shot-gun from SSMS: you can prevent them using DDL triggers, you can track them using Event Notifications, or you can fully audit them using C2 compliant audit."Plus you can overwrite their changes with what is in source control. Won't take long til they stop. Also take aawy their prod rights, so any changes to prod have to be deployed through the depolyment scripts.
HLGEM
The problem with running a diff between our/their DBs is that several clients actively add their own stored procedures (and sometimes even modify ours). I'd like my automation process to perform a one-shot update, after which I don't guarantee anything. So it looks like I'm forced to stay within the MS toolset? The post I referenced seemed to steer away from this. Thank you for your help.
Alex
Personally I use versioned schemas and upgrade scripts, as I describe in http://rusanu.com/2009/05/15/version-control-and-your-database. In my products I add the SQL scripts as resources to the app, and run them according to an upgrade map (i.e. run *this* script to upgrade from version v1.3 to v1.4, then run *that* script to upgrade from v1.4 to v1.5, and so on until you finally reach the current version). I prefer this over VS DB projects because I don't like how vsdbcmd and other diff-based tools handle deployment of a new version, especially for large tables.
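The upgrade-map idea in the comment above can be sketched in a few lines: starting from whatever version the target database reports, follow the map one hop at a time until the app's current version is reached. The map entries, script names, and version numbers here are invented for illustration:

```python
# Sketch of a version-to-version upgrade map: each entry says which script
# moves a database from one version to the next. All names are made up.
UPGRADE_MAP = {
    "1.3": ("upgrade_1.3_to_1.4.sql", "1.4"),
    "1.4": ("upgrade_1.4_to_1.5.sql", "1.5"),
    "1.5": ("upgrade_1.5_to_1.6.sql", "1.6"),
}
CURRENT_VERSION = "1.6"

def scripts_to_run(db_version):
    """Return the ordered list of scripts needed to bring a database
    at db_version up to CURRENT_VERSION."""
    scripts = []
    while db_version != CURRENT_VERSION:
        script, db_version = UPGRADE_MAP[db_version]
        scripts.append(script)
    return scripts

print(scripts_to_run("1.3"))
# ['upgrade_1.3_to_1.4.sql', 'upgrade_1.4_to_1.5.sql', 'upgrade_1.5_to_1.6.sql']
```

In the real product the scripts would be embedded as app resources and executed against the database in this order, with the database's stored version number updated after each step.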
Remus Rusanu
Continuing along this route, I have run into another problem: http://stackoverflow.com/questions/3007630/sqlmetal-datacontext-associations-not-generated ....
Alex