views: 152
answers: 5
For the last few years I was the only developer who handled the databases we created for our web projects, which meant I had full control of version management. I can't keep up with all the database work anymore and I want to bring some other developers into the cycle.

We use Tortoise SVN and store all repositories on a dedicated server in-house. Some clients require us not to have their real data on our office servers so we only keep scripts that can generate the structure of their database along with scripts to create useful fake data. Other times our clients want us to have their most up to date information on our development machines.

So what workflow do larger development teams use to handle version management and sharing of databases? Most developers prefer to deploy the database to an instance of SQL Server on their development machine. Should we:

  1. Keep the scripts for each database in SVN and make developers export new scripts if they make even minor changes
  2. Detach databases after changes have been made and commit MDF file to SVN
  3. Put all development copies on a server on the in-house network and force developers to connect via remote desktop to make modifications
  4. Some other option I haven't thought of
A: 

Have you looked at a product called DB Ghost? I have not personally used it, but it looks comprehensive and may offer an alternative as part of point 4 in your question.

Kane
+2  A: 

Option (1). Each developer can have their own up-to-date local copy of the DB ("up to date" meaning recreated from the latest version-controlled scripts: base + incremental changes + base data + run data). In order to make this work, you should have the ability to 'one-click' deploy any database locally.
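A minimal sketch of that 'one-click' local deploy, assuming the version-controlled scripts are plain `.sql` files that apply cleanly in filename order. This uses sqlite3 only as a stand-in for a SQL Server connection, and the function and directory names are illustrative, not from the answer:

```python
import sqlite3
from pathlib import Path

def rebuild_database(script_dir, db_path):
    """Recreate a local development database from version-controlled scripts.

    Assumes scripts are named so that sorting gives the right apply order,
    e.g. 001_base_schema.sql, 002_changes.sql, 003_fake_data.sql.
    """
    Path(db_path).unlink(missing_ok=True)          # drop the old local copy
    conn = sqlite3.connect(db_path)
    for script in sorted(Path(script_dir).glob("*.sql")):
        conn.executescript(script.read_text())     # base + increments + data
    conn.commit()
    return conn
```

A developer (or a build task) calls `rebuild_database("db/scripts", "dev.db")` whenever they pull new scripts from SVN, so nobody's local copy drifts from what is under version control.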

Mitch Wheat
A: 

In a previous company (which used Agile in monthly iterations), .sql files were checked into version control, and (an optional) part of the full build process was to rebuild the database from production then apply each .sql file in order.

At the end of the iteration, the .sql instructions were merged into the script that creates the production build of the database, and the script files were moved out. So you're only applying updates from the current iteration, not going back to the beginning of the project.
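The end-of-iteration step above can be sketched as follows; the file layout and function name are assumptions for illustration, since the answer doesn't specify how the merge was done:

```python
from pathlib import Path

def merge_iteration_scripts(iteration_dir, base_script):
    """Fold this iteration's .sql changes into the production create script,
    then remove them so the next iteration starts with an empty folder."""
    base = Path(base_script)
    merged = base.read_text()
    for script in sorted(Path(iteration_dir).glob("*.sql")):
        merged += "\n\n-- merged from " + script.name + "\n" + script.read_text()
        script.unlink()                     # move the applied script out
    base.write_text(merged)
```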

dj_segfault
+1  A: 

You really cannot go wrong with a tool like Visual Studio Database Edition. This is a version of VS that manages database schemas and much more, including deployments (updates) to target server(s).

VSDE integrates with TFS so all your database schema is under TFS version control. This becomes the "source of truth" for your schema management.

Typically developers will work against a local development database, and keep its schema up to date by synchronizing it with the schema in the VSDE project. Then, when the developer is satisfied with his/her changes, they are checked into TFS, and a build and then deployment can be done.

VSDE also supports refactoring, schema compares, data compares, test data generation and more. It's a great tool, and we use it to manage our schemas.

Randy Minder
+4  A: 

Never have an MDF file in the development source tree. MDFs are a result of deploying an application, not part of the application sources. Thinking of the database file as a development source is a shortcut to hell.

All the development deliverables should be scripts that deploy or upgrade the database. Any change, no matter how small, takes the form of a script. Some recommend using diff tools, but I think they are a rat hole. I champion versioning the database metadata and having scripts that upgrade from version N to version N+1. At deployment the application checks the currently deployed version, then runs all the upgrade scripts needed to bring it up to current. There is no script that deploys the current version directly: a new deployment first deploys v0 of the database, then goes through every version upgrade in turn, including dropping objects that are no longer used. While this may sound a bit extreme, it is exactly how SQL Server itself keeps track of the various changes occurring in the database between releases.
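The check-version-then-upgrade loop described above can be sketched like this. It is a minimal illustration, not the answer's actual implementation: sqlite3 stands in for SQL Server, the `schema_version` table and `UPGRADES` list are assumed names, and real projects would keep each upgrade as its own .sql file under version control:

```python
import sqlite3  # stand-in for a SQL Server connection; the pattern is the same

# Hypothetical upgrade scripts, one per version step.
UPGRADES = [
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)",   # v0 -> v1
    "ALTER TABLE customer ADD COLUMN email TEXT",                  # v1 -> v2
]

def get_version(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT version FROM schema_version").fetchone()
    return row[0] if row else 0

def set_version(conn, version):
    conn.execute("DELETE FROM schema_version")
    conn.execute("INSERT INTO schema_version VALUES (?)", (version,))

def upgrade(conn):
    """Run every upgrade script between the deployed version and current."""
    current = get_version(conn)
    for step in range(current, len(UPGRADES)):
        conn.execute(UPGRADES[step])
        set_version(conn, step + 1)
    conn.commit()

conn = sqlite3.connect(":memory:")
upgrade(conn)   # fresh DB: starts at v0, applies both upgrade steps
upgrade(conn)   # already current: does nothing
```

Because a fresh database simply starts at v0 and replays every step, new deployments and upgrades of existing deployments go through exactly the same code path.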

As simple text scripts, all the database upgrade scripts are stored in version control just like any other sources, with tracking of changes, diff-ing and check-in reviews.

For a more detailed discussion and some examples, see Version Control and your Database.

Remus Rusanu
+1: I have seen SQL Server version control implemented very badly in a large number of different ways. For me personally, this is the method I also consider the most effective.
John Sansom
Totally agree. But the more complex the system (databases, environments, backup plans, SQL Agent jobs, etc. etc. etc.), the more complex the database versioning system has to be. And when you factor in how to provide developers with "realistic data" against which to develop new code, and how to keep that data "fresh" over time and versions, in a large organization for *any* value of large, it becomes a very complex problem indeed.
Philip Kelley
@Philip: true, it can get hairy. One thing to keep in mind though is the necessary separation of duties between development and deployment. Physical database layout (assignment of files to disks), maintenance plans and many Agent jobs are deployment duties that fall into the realm of DBAs, not developers. Those are usually not scripted into apps because they are part of the larger plan of the DBA team of 'how do we administer 1000 servers'.
Remus Rusanu