
Hi, all.

I'm currently working on a web project, in which we all connect to one development database.

And like any other centralized system, over time this database has turned out to be a single point of failure.

If one of the developers carelessly dumps some dirty data into the database, all the other developers suffer from it.

So I think maybe we should do something about it: say, each of us makes a copy of the original database, and we set up our web applications to connect to the local copies.

Please give me some suggestions. Thanks in advance.

EDIT: In my case, the core members of the team are 5 developers and 1 tester (mainly doing black-box testing). The development process goes like this: each developer is responsible for one sub-feature and works on his own branch, and then we merge his branch into a trunk, on which the tester tests the application.

+2  A: 

The problem with having everybody develop from their own separate copy of the database is that they won't pick up the other developers' changes.

For example, if someone adds a column to a database table, the other developers won't be aware of that. Someone else may even add the same column inadvertently.

And, if someone alters a stored procedure in a way that requires a change in the app code (for example, to add an input parameter), the other developers won't know that. If they get updated code from source control, it won't work on their local database copy.

I agree that having an increasingly muddied-up development database is a problem. But I am not sure that developing from multiple copies of the database is going to reduce the number of development problems.

One alternative, which I recognize may not be available to you, is to periodically copy the production database to development. Generally, this can only occur after a new release which makes whatever changes are needed in the production database schema. But you have to have a production database to do this.

DOK
Personally, I've found it easier to synchronize developer changes -- you do need to talk about them so they don't conflict -- along with code changes than to try to have multiple people working with unsynchronized code against a single instance. Forcing code/DB sync at a larger level of granularity seems to let each person move faster than if they always have to keep updating their local code copies to match the single DB. Note that I'm talking about integrating bi-weekly, or per iteration at worst, rather than daily. I don't let the changes drift too far.
tvanfosson
I agree with you on the downside of having a local copy of the database. Maybe this is the kind of problem that can only be solved by focusing on the problem itself instead of searching for, well, a best practice.
Satoru.Logic
Red Gate (who I work for) will be solving this very problem with a product called SQL Source Control (http://www.red-gate.com/Products/SQL_Source_Control/index.htm). This won't be out for a few months but in the meantime you can synchronize developer databases with SQL Compare.
David Atkinson
A: 

On my projects the development database is always on the developers local machine. We use either SQL Server Developer Edition or SQL Server Express. We keep a SQL script for creating the checked in DB in the source code repository. Anyone needing to recreate their local DB can use this. One team member is responsible for maintaining the SQL script and any database changes go to him first (as SQL scripts, typically). He gets the most recent version of the script from SCM, updates his local copy and generates an updated creation script which replaces the script in SCM. At the same time accompanying code changes are checked into SCM so that the code and the DB are in sync. Everyone else synchronizes to this version.

This gives people the freedom to work in parallel and make schema changes as needed, including experimental changes that may be dropped. We also use the local DB as the source for dummy data for local testing of the web app -- not unit testing, we generally mock the DB out for that, but integration and UI testing. Keeping the local databases separate allows each dev to set it up for their testing as needed without any coordination with others.

I should also mention that we use Red Gate's SQL Compare to propagate just the changes around once the coordinator's local DB is at the official checked in version. This is an alternative to dropping and re-creating the DB and works a lot better to preserve the existing data. Depending on the DB changes this can either be a trivial or somewhat complex operation. We always use it to update the QA/Test and Production DBs unless the changes are so trivial that they can be made by hand (without mistakes).
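The rebuild-from-a-checked-in-script part of this workflow can be sketched briefly. The sketch below uses SQLite and an invented two-table schema purely for illustration; in the actual setup described above, the script would be the SQL Server creation script kept in SCM.

```python
import sqlite3

# Stand-in for the checked-in creation script (e.g. a create_db.sql
# file in the source code repository). Schema is invented for the sketch.
CREATE_SCRIPT = """
CREATE TABLE users (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE orders (
    id      INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL REFERENCES users(id)
);
"""

def rebuild_local_db(path=":memory:"):
    """Recreate a developer's local database from the checked-in script."""
    conn = sqlite3.connect(path)
    conn.executescript(CREATE_SCRIPT)
    return conn

conn = rebuild_local_db()
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['orders', 'users']
```

Since the script is the single source of truth, anyone whose local DB drifts can throw it away and rerun this in seconds, which keeps code and schema in sync at every checkout.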

tvanfosson
I should qualify this to say that this has always been on a relatively small team. I would think that it would scale or be adaptable to larger teams (teams of teams?), but I don't have any direct experience with that.
tvanfosson
A: 

What about a personal, local SQLite database? Many Rails developers are happy with that solution.
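A minimal sketch of that per-environment setup (mirroring Rails' convention of db/development.sqlite3 and db/test.sqlite3, with paths and schema invented here, and a temp dir used just for the sketch):

```python
import os
import sqlite3
import tempfile

# One SQLite file per environment, so test runs never touch dev data.
BASE = tempfile.mkdtemp()
DATABASES = {
    "development": os.path.join(BASE, "development.sqlite3"),
    "test": os.path.join(BASE, "test.sqlite3"),
}

def connect(env):
    """Open the SQLite database file for the given environment."""
    return sqlite3.connect(DATABASES[env])

dev = connect("development")
dev.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
dev.execute("INSERT INTO posts (title) VALUES ('hello')")
dev.commit()

# The test database is a completely separate file with its own schema.
test = connect("test")
tables = test.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables)  # [] -- dev data never leaks into the test database
```

Since each database is just a file, it can be copied, deleted, or (as suggested below) even put under version control.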

zzeroo
One of my colleagues came up with this suggestion too :)
Satoru.Logic
As I said, a lot of Rails developers are very happy with that solution. You could, for example, have one database for testing and another for development, and even keep these databases under version control.
zzeroo
+1  A: 

A solution that works very well for my company is to run a database in a virtual machine for each developer.

We have set up one virtual machine for each database we support (Oracle, DB2, MSSQL, MySQL). Now every developer can simply copy and run a virtual machine locally without having to install the database software themselves.

tangens
+1  A: 

I've found it very helpful to take the time to set up a system where each developer has his own database, which gets flushed, rebuilt and filled with test data every time they run their unit tests. That way, you can never be in each other's way. Of course, the continuous integration and test servers should also have their own databases.

As long as the DDL and test-data are in version control, every one is working against the same database. Another advantage is that if I'm working on a feature that requires a DB change, everyone gets both the code and the DDL + test-data required for that change.

In the Java sphere, DbUnit, in our case together with a Hibernate Maven plugin, is very helpful for that. Of course, a simple homebrew solution may do fine.
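A homebrew version of the flush-and-rebuild idea can be sketched in a few lines. The DDL and test rows below are invented for illustration; in practice both would live in version control so everyone tests against the same database shape.

```python
import sqlite3

# Checked-in artifacts (invented here): the schema and the known test data.
DDL = "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)"
TEST_DATA = [(1, 100), (2, 250)]

def fresh_db():
    """Build a private, throwaway database for a single test run."""
    conn = sqlite3.connect(":memory:")  # flushed automatically on close
    conn.execute(DDL)
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", TEST_DATA)
    return conn

# Every test run gets its own database, so tests never interfere.
conn = fresh_db()
total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(total)  # 350
```

A test framework would call something like fresh_db() in its per-test setup hook; DbUnit plays that role in the Java setup described above.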

Confusion
A: 

We use about the same basic process that other people have described here, with some significant changes. Each developer normally has their own db instance and we manage it with change scripts. Here are the details:

  1. We have a base database script to create the initial db. This includes creating a table in the db that we use to keep track of the schema version of the db.
  2. All SPs, views, and functions are scripted into individual files.
  3. If you need to make a schema change, you write a script that does that change. It must check the schema version table to make sure that the db is at the right version to apply the change. Tools like Red Gate's help greatly in writing these scripts, but they are not essential.
  4. We have a small program that we wrote that automatically runs all of these scripts against a database. It will check the current schema version of the db and only run new schema change scripts. It will always run all of the SP, view, etc scripts.

It gets a bit sloppy at points, because the schema change scripts must have their version number encoded in the file name, and you can get conflicts when two developers both create a script with the same version number. There are also times when the db ends up in an inconsistent state and the automated process fails. But overall it has worked very well for us.
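The small runner program described in step 4 can be sketched as follows. This uses SQLite for brevity, and the change scripts and version numbers are invented; a real runner would also apply the SP/view/function scripts on every run.

```python
import sqlite3

# Version-numbered schema change scripts (invented for the sketch);
# in the real setup each one lives in its own file in SCM.
CHANGE_SCRIPTS = {
    1: "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)",
    2: "ALTER TABLE customers ADD COLUMN email TEXT",
}

def current_version(conn):
    """Read (creating if needed) the schema version tracking table."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    return row[0] or 0

def upgrade(conn):
    """Apply only the change scripts newer than the recorded version."""
    version = current_version(conn)
    for v in sorted(CHANGE_SCRIPTS):
        if v > version:
            conn.execute(CHANGE_SCRIPTS[v])
            conn.execute("INSERT INTO schema_version VALUES (?)", (v,))
    conn.commit()

conn = sqlite3.connect(":memory:")
upgrade(conn)
upgrade(conn)  # second run is a no-op: the db is already at version 2
print(current_version(conn))  # 2
```

Because the runner checks the version table before applying anything, it is safe to run it against any developer's database, no matter how far behind that copy is.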

jeffesp