views: 41
answers: 3
Especially if I'm using GitHub? If not, should I (or perhaps someone else) work on such a tool?

+2  A: 

I think the current best practice is to pull production data from the production system into a developer database and test with that. Distributing data through VCS carries the risk that it diverges too much from real data.
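As a rough sketch of that approach (the hostnames, user names, and database names here are only placeholders):

# Dump the production database and load it into a local developer database
mysqldump -h prod.example.com -u reporter -p prod_db > prod_dump.sql
mysql -u root -p dev_db < prod_dump.sql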

If you just mean distributing the current database schema: frameworks such as RoR have migrations for that, which update the database schema incrementally.
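For illustration, a minimal sketch of the Rails workflow from the shell (the migration and column names are made up):

# Generate a migration file describing the schema change
rails generate migration AddEmailToUsers email:string

# Apply any pending migrations to the local database
rake db:migrate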

Anyway, it's just a matter of putting the dump file into VCS and importing it every time you pull a new version, so I don't really see the need for a tool for that.
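The import half of that round trip could look like this (assuming the dump was committed as my_db.sql; the database name is a placeholder):

# After pulling a new version of the repository, reload the committed dump
git pull
mysql -u root -p my_db < my_db.sql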

Sven Koschnicke
There are tons of frameworks, though, that unfortunately do not enforce database migrations (WordPress or Joomla, for example), so there are many cases where it is nice to have a versioned copy of the schema, not the data, to replicate the database quickly on another machine.
Jed Schneider
Are there any PHP frameworks that update the database schema?
Zachary Burt
A: 

The output of mysqldump is a simple text file.

Each time you change your database, call mysqldump to the same file. Then commit it to your repository as you commit your code.
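A minimal sketch of that loop (the database and file names are placeholders; --skip-dump-date, if your mysqldump supports it, keeps a changing timestamp comment out of the file so an unchanged database does not produce spurious diffs):

# Re-dump to the same file, then commit it alongside your code
mysqldump --skip-dump-date -u root -p my_db > my_db.sql
git add my_db.sql
git commit -m "update database dump"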

Alex
A: 

Add a script that dumps the SQL file somewhere specific to your project (but inside the project directory).

mysqldump -uroot -p my_db > my_project/my_db.sql

It is just text, so Git will pick that file up as changed like any other file. So then just:

git add my_db.sql
git commit -m "the schema changed"
git push github master

You may want to consider the following:

  • Putting your dump file on GitHub, or adding it to the repo at all, means any data in it (usernames, passwords, and so on) is visible to anyone who can see the repository, so obviously use fake data.
  • You may want to look at the mysqldump options for dumping only the schema and not the data (see the sketch below): http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
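For example, a schema-only dump could look like this (--no-data skips the row data; the database and file names are placeholders):

# Dump table definitions only, with no row data
mysqldump --no-data -u root -p my_db > my_project/schema.sql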
Jed Schneider