Hi,

I'm setting up a continuous integration server (Hudson) to build a Java project and run the relevant unit/integration tests. Most of these tests access a database and the test data is kept in a DbUnit XML file.

I'm looking for a way to automatically keep the test database schema up-to-date. Currently the SQL script for a particular release is stored in a directory named after the release version:

└───scripts
    ├───0.1.0
    ├───0.1.1
    ├───0.1.2
    ├───0.1.4

For example, the SQL script for version 0.1.4 is

scripts\0.1.4\script-0.1.4.sql

The problem is that these scripts contain a mixture of schema changes (e.g. ALTER TABLE...) and changes to the static tables (e.g. add a new role to the USER_TYPE table).

In the case of the unit tests I only want to apply the schema changes because, as mentioned above, all the data for the unit tests is kept in a DbUnit XML file. Although I could separate these two types of database changes into different files, there will often be a dependency between the schema changes and the data changes that will somehow need to be enforced when the release is applied to QA, production, etc.

Anyway, this is just a very long-winded way of asking: has anyone come up with a robust way to automatically keep their test schema up-to-date? I know Unitils has some support for keeping a test schema up-to-date, but I'm not sure whether it can 'ignore' data update statements in the SQL delta scripts.

Thanks, Don

A: 

What I do in my tests:

  • I keep a DB version somewhere
  • In the first test, I tear down the whole DB and build it from scratch
  • I run each schema update in an individual test
  • I run the "update DB" module as an individual test (must not do anything because all changes have been applied already). Optionally, I tear down the DB again and run this once.
  • I load the test data into the DB (some of the tests above will do this if they fix data errors).

Now, the test DB is ready for the "real" (application) tests. After each of the application tests, I roll back the current transaction so the test DB never changes after setup.
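
A minimal JUnit sketch of the rollback-per-test idea; the in-memory H2 URL, the users table, and the class name are my own placeholders, not part of the setup described above:

import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class UserDaoTest {
    private Connection connection;

    @Before
    public void setUp() throws Exception {
        // Placeholder in-memory H2 DB; in the real suite the schema
        // has already been built by the DB setup tests.
        connection = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "");
        connection.createStatement().execute(
                "CREATE TABLE IF NOT EXISTS users (email VARCHAR(255))");
        connection.setAutoCommit(false); // each test runs inside one transaction
    }

    @Test
    public void insertedRowIsVisibleInsideTheTransaction() throws Exception {
        int rows = connection.createStatement().executeUpdate(
                "INSERT INTO users (email) VALUES ('[email protected]')");
        assertEquals(1, rows);
    }

    @After
    public void tearDown() throws Exception {
        connection.rollback(); // undo the test's changes: the DB never drifts
        connection.close();
    }
}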

To make testing faster, I usually have three test suites: one which contains only the DB setup, one which contains only the application tests, and one which contains the other two suites. This allows me to quickly reset the test DB and run a single test from the app suite. A sketch of that layout follows.
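
Here is one way the three-suite layout could look with JUnit 4; the stub test classes are placeholders for the real setup and application tests:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import org.junit.runners.Suite.SuiteClasses;

// Stub tests standing in for the real DB setup and application tests.
class DbSetupTest { @Test public void rebuildSchema() { /* tear down + rebuild */ } }
class ApplicationTest { @Test public void someAppTest() { /* a "real" test */ } }

// Suite 1: the DB setup only.
@RunWith(Suite.class)
@SuiteClasses(DbSetupTest.class)
class DbSetupSuite {}

// Suite 2: the application tests only.
@RunWith(Suite.class)
@SuiteClasses(ApplicationTest.class)
class AppSuite {}

// Suite 3: what the CI server runs: setup first, then the app tests.
@RunWith(Suite.class)
@SuiteClasses({ DbSetupSuite.class, AppSuite.class })
public class AllTests {}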

Aaron Digulla
A: 

What we have found to be the most manageable way to handle the gradual evolution of live/test DB schemas is to use a schema migration management tool like Liquibase.

This allows us to apply the latest schema changes to whatever environment we choose, test or otherwise, in a consistent fashion, which then allows us to run whatever kind of automation we wish against the up-to-date schema.

j pimmel
A: 

I use migrateDB to manage this problem.

This tool is based on the notion that there are "tests" you can perform (via SQL) on your database to see whether a given database change has been applied, and a corresponding set of actions to perform if a test "fails." For example, you might query the database's metadata to see if a table or column exists and, if it does not, create it. Or you might check whether a certain row exists in a table and, if not, insert it. The tool comes with a few common tests and actions preconfigured, and it is easy to add your own (with only XML configuration; no new code is needed).
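
To make the idea concrete, here is a plain-JDBC sketch of one such test/action pair. This illustrates the concept only, not migrateDB's actual XML configuration format; the H2 URL and the USER_TYPE table (borrowed from the question) are assumptions:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TestThenAct {
    public static void main(String[] args) throws Exception {
        // Placeholder in-memory H2 database standing in for the real test DB.
        Connection c = DriverManager.getConnection("jdbc:h2:mem:demo", "sa", "");
        Statement s = c.createStatement();
        s.executeUpdate("CREATE TABLE IF NOT EXISTS user_type (name VARCHAR(50))");

        // Test: is the ADMIN role already present in the static table?
        ResultSet rs = s.executeQuery(
                "SELECT COUNT(*) FROM user_type WHERE name = 'ADMIN'");
        rs.next();
        int count = rs.getInt(1);

        // Action: performed only when the test "fails" (the row is missing).
        if (count == 0) {
            s.executeUpdate("INSERT INTO user_type (name) VALUES ('ADMIN')");
        }
        c.close();
    }
}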

As a little bonus, each of these tests and actions is configured for each "dialect" of SQL (so, for example, you can have an "oracle" dialect and a "mySQL" dialect.) This means that once you define the queries for given tests and actions for each dialect, each new instance of a test or action requires no new SQL, and can be executed against multiple target databases.

Then, you just maintain a small XML file that lists the tests and corresponding actions, and run the tool against your database after each build.

It works quite well for us.

Jared
+2  A: 

I currently use a similar approach. I've been researching db migration tools and haven't found one that addresses the issue you describe.

The problem is that sometimes a schema change requires the data to be altered so that new constraints can be created, etc. In that case, if data update statements were ignored, the migration would fail.

Would adding an SQL script to your test suite that deletes all the data in the database work for you?

So the process would be:

  1. Run the DB migration.
  2. Run a script to delete all data in the DB (see the sketch below).
  3. Load the test data.
  4. Run the tests.
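
A rough sketch of what step 2 could look like in JDBC. The connection URL is a placeholder, and the referential-integrity toggle shown is H2-specific; other databases need their own way to defer foreign-key checks:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class WipeData {
    public static void main(String[] args) throws Exception {
        Connection c = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "");

        // Collect every user table from the database metadata.
        List<String> tables = new ArrayList<String>();
        ResultSet rs = c.getMetaData().getTables(null, null, "%", new String[] { "TABLE" });
        while (rs.next()) {
            tables.add(rs.getString("TABLE_NAME"));
        }

        // Delete every row but keep the schema intact.
        Statement s = c.createStatement();
        s.execute("SET REFERENTIAL_INTEGRITY FALSE"); // H2-specific
        for (String table : tables) {
            s.executeUpdate("DELETE FROM " + table);
        }
        s.execute("SET REFERENTIAL_INTEGRITY TRUE");
        c.close();
    }
}
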
Seth Reno
A: 

Here is what we do:

$ find src/sql/ | grep -v /.svn
src/sql/
src/sql/0000-system.sql
src/sql/0000-system.sql.dev.log
src/sql/0000-system.sql.prod.log
src/sql/0000-system.sql.test.log
src/sql/0001-usgn.sql
src/sql/0001-usgn.sql.dev.log
src/sql/0001-usgn.sql.prod.log
src/sql/0001-usgn.sql.test.log
src/sql/0002-usgn.sql
src/sql/0002-usgn.sql.dev.log
src/sql/0002-usgn.sql.prod.log
src/sql/0002-usgn.sql.test.log
src/sql/0003-usgn.sql
src/sql/0003-usgn.sql.dev.log
src/sql/0003-usgn.sql.prod.log
src/sql/0003-usgn.sql.test.log
src/sql/0004-system.sql
src/sql/0004-system.sql.dev.log
src/sql/0005-usgn.sql
src/sql/purge.sql

The naming scheme is {sequence number}-{database user credential}.sql (e.g. 0001-usgn.sql).

Now, our tests always allow for an unknown starting state of the data in the DB. If you cannot do that, then I would suggest using a SEQ-CRED-TYPE.sql naming scheme, where TYPE would be dml or ddl, and filtering out the dml scripts (see the sketch below).
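
A small sketch of that filtering, assuming the scripts live in src/sql and the DDL scripts carry a -ddl.sql suffix (both assumptions, not part of the layout shown above):

import java.io.File;
import java.io.FilenameFilter;
import java.util.Arrays;

public class DdlOnly {
    public static void main(String[] args) {
        // Keep only the schema (DDL) scripts, e.g. 0001-usgn-ddl.sql.
        File[] ddlScripts = new File("src/sql").listFiles(new FilenameFilter() {
            public boolean accept(File dir, String name) {
                return name.endsWith("-ddl.sql");
            }
        });
        if (ddlScripts == null) {
            return; // directory not found
        }
        Arrays.sort(ddlScripts); // the numeric prefix keeps execution order
        for (File script : ddlScripts) {
            System.out.println("would apply: " + script.getName());
            // actually running each script against the test DB is left to your runner
        }
    }
}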

Jason Pyeron
+1  A: 

A previous poster listed Liquibase as an option; however, they did not mention Liquibase's ability to define rules that run only in particular contexts (Contexts in Liquibase). This allows you to leave the schema updates unmarked by any particular context and to mark the fixtures for the unit tests with a context of test. That way, the fixtures are only inserted when you run your unit tests.

Here is an example of a Liquibase change set that contains the schema and the fixtures:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog/1.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog/1.9 http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-1.9.xsd"&gt;
  <changeSet author="avalade" id="1">
    <createTable tableName="users">
      <column autoIncrement="true" name="id" type="long">
        <constraints nullable="false" primaryKey="true" />
      </column>
      <column name="email" type="varchar(255)" />
    </createTable>
  </changeSet>
  <changeSet author="avalade" id="2" context="test">
    <insert tableName="user">
      <column name="id" value="1" />
      <column name="email" value="[email protected]" />
    </insert>
  </changeSet>
</databaseChangeLog>

Then, if you're using Spring to manage your DAOs, you could put the following in the application context file that you deploy:

<bean id="liquibase" class="liquibase.spring.SpringLiquibase">
  <property name="dataSource" ref="dataSource" />
  <property name="changeLog" value="classpath:dbChangelog.xml" />
</bean>

For the application context file that you use in your unit tests, configure Liquibase with an additional contexts property:

<bean id="liquibase" class="liquibase.spring.SpringLiquibase">
  <property name="dataSource" ref="dataSource" />
  <property name="changeLog" value="classpath:dbChangelog.xml" />
  <property name="contexts" value="test" />
</bean>

This way, you can keep all of your database definitions together in one place and only insert your fixtures when you're running your test code.

Aaron