I've recently given it a try (SQL 2008 version) and I find them quite okay. Well, the only problem is that the wizard is not intelligent enough to update database structure without first erasing all data. Is this why these projects are not used in practice?

Actually, I've never seen anyone using them or even mentioning them at all. There's also nothing to be seen in blogs and forums unless you explicitly look for it.

What's wrong with them?

+5  A: 

I use the database project that is part of Visual Studio Database Edition. It's a great tool. Basically, you define the entire schema in CREATE scripts, which are then checked into source control. It also has built-in tools to generate difference scripts, which, by the way, do not delete data.

It also has data-compare tools, so you can compare data between databases and generate a script to make them the same.

The recent GDR release adds some interesting features. For example, if you use the built-in deployment method, you can generate a deployment package that, when run, analyzes the target database and applies only the differences.
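As a rough illustration, a GDR build output (a .dbschema model) can be deployed from the command line with VSDBCMD, which compares the model against the target and applies only the differences. The model file, connection string, and database name below are placeholders, not anything from this answer:

```shell
#!/bin/sh
# Hedged sketch: deploy a GDR database project's build output so that only
# the differences from the target database are applied. All names and paths
# are placeholders; VSDBCMD can be overridden (e.g. VSDBCMD=echo for a dry run).
deploy_db() {
    ${VSDBCMD:-vsdbcmd} /a:Deploy /dd:+ \
        "/model:${1:-MyDb.dbschema}" \
        "/cs:${2:-Data Source=SQLSERVER01;Integrated Security=True}" \
        "/p:TargetDatabase=${3:-MyDb}"
}
```

The `/dd:+` switch tells VSDBCMD to actually deploy to the database rather than only generating the deployment script.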

If you have Visual Studio Team System Team Suite or Development Edition, then you can use the Database Edition.

Give it a try; it's a great evolution in database development.

JoshBerke
+5  A: 

We use them. We keep all of our schema create/update scripts and stored procedures in one. The main benefit is that we can connect the project to SourceSafe or SVN.

It's an easy way to keep your SQL scripts under version control.

It's a little quirky trying to do SQL testing in VS, but you can find ways around it.

Update

We've actually built it into our deployment process: our deploy tool goes through the DB project (except for flagged folders) and runs all the scripts. We just built a quick tool to run the project. If anyone has other solutions for deploying the DB project, that would be helpful.
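A minimal sketch of such a runner, assuming a folder of .sql files and a hypothetical `_skip` naming convention for the flagged folders (the real tool isn't shown in this answer):

```shell
#!/bin/sh
# Minimal sketch of a DB-project deploy runner: walk the project folder,
# skip flagged folders (assumed here to be named "_skip"), and run each
# script in sorted order. The sqlcmd call is commented out, so as written
# this is a dry run that only prints what it would execute.
run_db_scripts() {
    find "$1" -name '*.sql' ! -path '*/_skip/*' | sort |
    while IFS= read -r script; do
        echo "running: $script"
        # sqlcmd -S "$DB_SERVER" -d "$DB_NAME" -b -i "$script" || exit 1
    done
}
```

In a real tool you would stop on the first failing script; sqlcmd's `-b` flag makes it return a non-zero exit code when a batch errors out.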

Glennular
+4  A: 

Like Glennular, we are using them to version-control our schema and sprocs.

Although we have a fairly advanced version control structure (CI, auto-deployments to dev, single-click deployments to stage and prod), we don't include any of the DB projects in that structure. We just don't trust it yet.

UPDATE: (for Out In Space)

We have separate TFS projects for functional areas of the company (Sales, Marketing, etc.). Within each TFS project we have a Main and a Production folder. We also have one TFS project that contains the database projects and another that contains common assemblies / Visual Studio projects.

Upon release, we branch from Main to Production. We don't have a staging branch as we move too fast to deal with that. Right or wrong, our productivity is measured in part by the number of production level releases we do per week; bug fixes, new features, etc.

CI is set up on the Main branch so that every check-in causes the build server to deploy to our DEV environments. Unit and web tests are then run, and the build quality is automatically set to "Development" if everything completes successfully. When someone changes the build quality to "In Staging", any previous "In Staging" builds are set to "Rejected", and that build is pushed to our staging servers with the config files updated to point to the correct servers. (I used TFS Deployer and PowerShell scripts for this.)

QA does its testing on our staging servers. Once they are happy, the production team changes the build quality to "Production". This causes the build to be sent to a production area, from which it is manually copied to the correct location. Once that's complete, production notifies development, who then branch that version into the Production folder. QA is also notified and runs a battery of production tests to verify everything is indeed working as expected.

We have reports set up to show the changes between production releases, so that we know every check-in being deployed. That prevents unknowns from popping up, such as an unexpected database change or other potentially breaking code.

Further, our BAs track work items via Team System Web Access and know when those items are in production.

Although our DBAs are using Database Edition (GDR), they haven't been impressed with its level of control over automated deployments. I'm hoping that Rosario brings better deployment control to the product line, but until then we have TFS Deployer and PowerShell.

Chris Lively
Can you expand on your version control structure a bit?
Robert S.
+1  A: 

We use the database project to provide version control for our SQL scripts. We also like using the Visual Studio environment to edit SQL; it's a little easier for some of our newer developers than Query Analyzer.

Robert S.
A: 

I've used them on a few paid projects and I think it's a great tool. That said, I've seen some issues:

  1. If the .dat file in your DB project folder gets out of sync with the temporary instance of the database, schema compare will give inaccurate results. I'm not sure how this happens; review schema compares carefully, and blow away your .dat file (after closing the solution) if things seem wrong.

  2. If you have 20+ databases that reference each other, including circular references... it's going to hurt. I haven't figured out how to make it scale to that scenario. GDR 2 seems to offer some promise.
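The stale-.dat fix in point 1 can be scripted. The assumption that the cached model file sits at the top of the project folder is mine, not the poster's; check where your .dat actually lives:

```shell
#!/bin/sh
# Hedged sketch: after closing the solution, delete the cached .dat model
# file(s) in the DB project folder so they are rebuilt on next open.
# The folder layout is an assumption about a typical project.
clear_model_cache() {
    find "${1:-.}" -maxdepth 1 -name '*.dat' -exec rm -f {} +
}
```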

Frank Schwieterman