When we first started using source control, the developers would just edit the scripts in the database, and right before the release one big script of all the changes would be generated. This worked great until one of the devs accidentally deleted a stored procedure and all the work was lost.

After that we put all the scripts to create the stored procedures in text files and stored them in source control. The problem here is that developers sometimes update the stored procedure in either source control or the database and forget to update the other.

My dream is to have a system where a dev checks out a stored procedure, and after the changes are made the database is updated automatically.

Is this just a dream? What is the best way to source control SQL Server?

+1  A: 

Hi,

The way I've seen source control work best for SQL Server in a team environment was when the DBA did regular builds of the database using checked-in code. It generally only takes a single instance of losing something because it wasn't checked in before developers get the picture that checking in their code means something.

Hope this helps,

Bill

Bill Mueller
+6  A: 

We have been using Visual Studio Team System Database Edition recently, and I have to say it has worked very well. All stored procedures are stored as files, and checked in and out of source control, and it has tools to generate scripts, etc.

Also, in the past we've used scripts stored as text files that were checked in and out of source control. The rule was that you had to check out the file, edit it in, for instance, Management Studio, save it, and check it back in. At the top of every stored procedure script file it would drop the existing stored proc and then use the CREATE statement to create a new one (this gets around the CREATE/ALTER problem). We then had a tool that would run all the scripts in the right order to build a blank database from scratch, and then we used Red Gate's SQL Compare product to generate a script to bring our existing databases up to date. I admit it was tedious.
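The drop-then-CREATE pattern described above might look something like this (a minimal sketch; the procedure and table names are hypothetical):

```sql
-- Hypothetical example of the drop-then-CREATE pattern.
-- Dropping first means a single script works whether or not
-- the procedure already exists, avoiding separate CREATE/ALTER scripts.
IF OBJECT_ID('dbo.usp_GetCustomer', 'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_GetCustomer;
GO

CREATE PROCEDURE dbo.usp_GetCustomer
    @CustomerId INT
AS
BEGIN
    SELECT CustomerId, Name
    FROM dbo.Customers
    WHERE CustomerId = @CustomerId;
END
GO
```

Because the script is rerunnable, the same file can be applied to both a blank database and an existing one.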

Around the same time I worked on a system with about 10 other developers, and they implemented a rigorous database change-management procedure. There were many, many applications that all depended on a set of 2 or 3 databases. Any time a database schema had to change (we're only talking tables, columns, and views here), a document was created that explained the change, along with a matrix that listed the changes against the applications we thought they would impact. The document was then circulated and had to be reviewed by someone responsible for each application, who had to search through their application for anywhere it might be affected. It was a long, arduous procedure, but it worked. However, stored procs were still just stored as text files in source control.

In the more distant past, with smaller projects that were more like desktop apps with a database as the datastore, every time the app started, I would:

  • Check to see if the database existed, and if not, create it
  • Check that all the tables existed, and if not, create them
  • Check that all the columns existed, and if not, add them

Whenever I needed to change the schema, I would just add more code to the end of the startup code to modify the schema as necessary, taking care to migrate any existing data. The benefit of this was that you could just uninstall and reinstall a new version of the software, and it would automatically upgrade the existing database to the latest version. Installation, upgrades, and maintenance were a dream. That wouldn't work for more "enterprisey" systems, though.
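The startup checks above can be sketched in T-SQL (the table and column names are hypothetical; a real version would keep each schema change and its data migration together):

```sql
-- Hypothetical startup script: create missing schema objects idempotently,
-- so the same code runs safely on both new and existing databases.
IF OBJECT_ID('dbo.Customers', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.Customers (
        CustomerId INT IDENTITY(1,1) PRIMARY KEY,
        Name       NVARCHAR(100) NOT NULL
    );
END

-- A later schema change, appended to the end of the startup code:
IF COL_LENGTH('dbo.Customers', 'Email') IS NULL
BEGIN
    ALTER TABLE dbo.Customers ADD Email NVARCHAR(256) NULL;
    -- migrate any existing data here if needed
END
```

Each new release appends another guarded step, so the script doubles as the upgrade path.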

You can reduce some of these problems by adopting the ADO.NET Entity Framework or another similar object-relational mapping layer, like EntitySpaces. These auto-generate classes for each entity (table) in your database, including properties for each column, and then allow you to extend those classes with custom logic. If you can get away from having your business logic in stored procedures and put it in the entity classes instead, the benefit is that it's strongly typed. Therefore if you change the name of a column, or delete a column, and regenerate your entity classes, your IDE or compiler will automatically flag all the places where the code is broken. Obviously, all the entity code naturally lives in source control with the rest of your source code too.

Scott Whitlock
Database Edition, especially the new builds coming out recently, is a great tool not only for version control, but also for flagging bad practices, helping with schema changes and refactoring, and not least helping with the deployment process at the end-user site.
Remus Rusanu
@Remus - yes, and I was impressed that it finds errors like old stored procs that reference a non-existent column in some long-forgotten table. It really helped clean things up.
Scott Whitlock
A: 

"This worked great until one of the devs accidentally deleted a stored procedure and all the work was lost." Whatever source control you are using, you can get any old version and retrieve the deleted procedure, no problem.

AlexKuznetsov
A: 

I've worked in an environment where source control was part of the release procedure.

DBAs were given release notes that required them to pull the stored procedure changes or SQL scripts from source control and release from there. If you can get the DBAs on side then this is a good way of avoiding aborted releases, as you should be able to pre-test the SQL on a UAT system.

If a script is not checked into source control, it doesn't get released.

Integration branches were used to release the code.

Simon
A: 

We have always used the scptxfr utility that comes with SQL Server 2000 to script the database out to a file that is stored under source control.

We run it before doing a check-in and it will highlight any changes that have occurred (whether expected or not). It does not come with 2005 or later, but if you have an old 2000 install it still works against the newer versions. It can have problems with complex schemas, but it is a good starting place. It could also be made an automatic process when combined with source control triggers or continuous integration.

Dave Glassborow
A: 

The key is to limit rights to prod to only a few individuals and to insist that they never make changes except by running a script from source control. In newer versions of SQL Server you can also set up DDL logging to find out exactly who changed that table to a version not in source control!
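One way to set up that kind of DDL logging is a database-level DDL trigger; a sketch along these lines (the audit table and trigger names are hypothetical):

```sql
-- Hypothetical DDL audit: log every schema change in this database
-- along with the login that made it.
CREATE TABLE dbo.DdlAudit (
    EventTime DATETIME NOT NULL DEFAULT GETDATE(),
    LoginName SYSNAME  NOT NULL,
    EventData XML      NOT NULL
);
GO

CREATE TRIGGER trg_AuditDdl
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    -- EVENTDATA() returns the full details of the DDL statement as XML
    INSERT INTO dbo.DdlAudit (LoginName, EventData)
    VALUES (ORIGINAL_LOGIN(), EVENTDATA());
END
GO
```

Any CREATE, ALTER, or DROP in the database then leaves a row behind, making unauthorized changes easy to spot when comparing against source control.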

When we first made the switch to using source control on our database, everyone had prod rights (something we have since fixed), so we had a DBA periodically compare source control check-ins with the actual prod database and get rid of any unauthorized changes. It only took doing it once to convince the developers that they had to use source control.

HLGEM
+2  A: 

Red Gate is mid-way through developing a tool that fully integrates source control into SQL Server Management Studio. This effectively links your development database(s) to your existing source control system, allowing developers to commit changes and retrieve other developers' changes at the click of a button. We plan to support TFS and SVN from the outset, but VSS may follow if there is sufficient demand. We have an early access program for those who want to be sent news of our progress and any early builds to try out. More information on the tool, and how to sign up, can be found at:

http://www.red-gate.com/products/SQL_Source_Control/

Best wishes,

David Atkinson, Product Manager, Red Gate Software

David Atkinson