I have an application in development that supports several databases, including SQL Server 2008, Oracle 10g, Oracle 11g, MySQL 5, etc. Within Maven I have already done three things:

1) There is a profile for each database so that the system can use it for integration testing during the build.

2) Maven invokes the hibernate3 plugin to automatically generate a schema script using hbm2ddl (a sketch follows this list).

3) A Hudson matrix build is set up so that integration tests are run against every database automatically.
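
For illustration, here is a minimal sketch of how points 1 and 2 might fit together in the POM. The profile ids, the `db.name` property, the property file paths, and the output file names are all hypothetical, and the exact `componentProperties` keys depend on the hibernate3-maven-plugin version:

```xml
<!-- Hypothetical sketch: each profile selects a database via a property,
     and a single hbm2ddl execution picks up the matching dialect. -->
<profiles>
  <profile>
    <id>oracle10g</id>
    <properties>
      <db.name>oracle10g</db.name>
    </properties>
  </profile>
  <profile>
    <id>mysql5</id>
    <properties>
      <db.name>mysql5</db.name>
    </properties>
  </profile>
  <!-- ...analogous profiles for SQL Server 2008 and Oracle 11g... -->
</profiles>

<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>hibernate3-maven-plugin</artifactId>
      <executions>
        <execution>
          <phase>process-classes</phase>
          <goals>
            <goal>hbm2ddl</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <componentProperties>
          <!-- hibernate.properties carrying the dialect for ${db.name} -->
          <propertyfile>src/main/config/${db.name}/hibernate.properties</propertyfile>
          <!-- write the script to a file instead of exporting to a live DB -->
          <export>false</export>
          <outputfilename>schema-${db.name}.sql</outputfilename>
        </componentProperties>
      </configuration>
    </plugin>
  </plugins>
</build>
```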

I have noticed (unsurprisingly) that the script generated by hbm2ddl differs depending on the database dialect.

However, when packaging the system for customers, we have to go into each Hudson build manually and pluck out the database-specific scripts, a tedious process that I am sure will bite us at the worst possible moment!

Is there any way we can have Maven automatically generate and then gather up all these database scripts so that they can be packaged alongside the WAR file we ship to customers? I was thinking of using the Maven assembly plugin to zip it all up, but I am not sure!

+1  A: 

Is there any way we can have Maven automatically generate and then gather up all these database scripts so that they can be packaged alongside the WAR file we ship to customers? I was thinking of using the Maven assembly plugin to zip it all up, but I am not sure!

The problem is that the scripts are generated on each profile run (at least that is my understanding), and unless you package them as distinct artifacts of some kind (they could be assemblies) and install/deploy them on each run, you won't be able to grab them in a subsequent step with Maven; one way to attach them is sketched below.
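
For illustration, attaching the generated script as a classified artifact might look something like this; `attach-artifact` is a standard build-helper-maven-plugin goal, but the file path and the `db.name` classifier below are hypothetical:

```xml
<!-- Hypothetical sketch: attach the generated script with a per-database
     classifier so install/deploy publishes it on each profile run. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-ddl-script</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>${project.build.directory}/hibernate3/sql/schema-${db.name}.sql</file>
            <type>sql</type>
            <classifier>${db.name}</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
```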

The alternative would be to have a module with several executions of the hbm2ddl goal defined (sketched below), but I'm afraid this would defeat the whole profile setup.
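
A rough sketch of that alternative, assuming each execution can point at its own Hibernate properties file (the supported `componentProperties` vary by plugin version, and all paths are hypothetical):

```xml
<!-- Hypothetical sketch: one hbm2ddl execution per dialect, all in one module. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>hibernate3-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>ddl-mysql5</id>
      <phase>process-classes</phase>
      <goals>
        <goal>hbm2ddl</goal>
      </goals>
      <configuration>
        <componentProperties>
          <propertyfile>src/main/config/mysql5/hibernate.properties</propertyfile>
          <export>false</export>
          <outputfilename>schema-mysql5.sql</outputfilename>
        </componentProperties>
      </configuration>
    </execution>
    <execution>
      <id>ddl-sqlserver2008</id>
      <phase>process-classes</phase>
      <goals>
        <goal>hbm2ddl</goal>
      </goals>
      <configuration>
        <componentProperties>
          <propertyfile>src/main/config/sqlserver2008/hibernate.properties</propertyfile>
          <export>false</export>
          <outputfilename>schema-sqlserver2008.sql</outputfilename>
        </componentProperties>
      </configuration>
    </execution>
    <!-- ...and so on for the remaining dialects -->
  </executions>
</plugin>
```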

Or you could maybe use the M2 Extra Steps Plugin to lean on Hudson and add some post-build steps to your Maven builds (I'm not sure whether this can help).

Pascal Thivent
Not sure this will help, but let me know what you think.
Pascal Thivent
After thinking some more, I believe it may have been a mistake to use Maven's profile mechanism for databases. It was just so darn easy and useful for the developer machines! But for the CI build machine... not so good.
HDave
@Pascal - I am seriously considering dropping hbm2ddl and going with Liquibase. As the project matures, I am going to have to provide a migration path between versions anyway. It also looks like its tag facility can handle loading integration test data well. My only issue is getting the Liquibase Maven plugin to generate SQL scripts in all the dialects...
HDave
@HDave Using a DB migration tool is a very good idea IMO (you'll have to deliver change scripts to upgrade customer DBs anyway, as you pointed out). And because you have to deal with several DBs, Liquibase seems to be the best option (it even provides some Hibernate support, allowing you to diff a schema against Hibernate entities, which is pretty nice). I don't have much experience with their Maven plugin, though (I've seen several posts where people call it via antrun to access all its features). And you'll still need several `execution` entries (and config files) to generate everything in a single build.
Pascal Thivent
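
For illustration, those multiple `execution` entries might look roughly like this with the Liquibase Maven plugin. The `updateSQL` goal writes a dialect-specific script instead of applying changes to the schema (it still needs a reference connection), and every path, URL, and credential below is hypothetical; parameter names such as `migrationSqlOutputFile` may differ between plugin versions:

```xml
<!-- Hypothetical sketch: one updateSQL execution per target database,
     each producing a dialect-specific script from the same change log. -->
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>sql-mysql5</id>
      <phase>process-resources</phase>
      <goals>
        <goal>updateSQL</goal>
      </goals>
      <configuration>
        <changeLogFile>src/main/resources/db.changelog.xml</changeLogFile>
        <driver>com.mysql.jdbc.Driver</driver>
        <url>jdbc:mysql://localhost/refdb</url>
        <username>build</username>
        <password>build</password>
        <migrationSqlOutputFile>${project.build.directory}/sql/schema-mysql5.sql</migrationSqlOutputFile>
      </configuration>
    </execution>
    <!-- ...analogous executions for Oracle 10g/11g and SQL Server 2008 -->
  </executions>
</plugin>
```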
@Pascal Thanks for the additional info. I think one of the key points with Liquibase is that I won't be generating database-specific SQL scripts anymore. I'll either use the servlet listener or have my customers run a script; either way, Liquibase executes the XML change log file directly against their database. Thus I avoid the issue entirely. Is this reasonable, as far as you know?
HDave
@HDave Ah yes, you're right, you can avoid the database-specific SQL script step entirely, and that obviously removes the problem :) And using the servlet listener seems like a very good idea (I need to check how it works in clustered environments, but I'll keep it in mind!). This sounds good.
Pascal Thivent
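
For reference, a minimal sketch of the servlet-listener wiring in web.xml. The listener class and context-param names changed between Liquibase versions (the Liquibase 2.x names are shown), and the change log path and JNDI data source name are hypothetical:

```xml
<!-- Hypothetical sketch: run the Liquibase change log at webapp startup. -->
<context-param>
  <param-name>liquibase.changelog</param-name>
  <param-value>com/example/db.changelog.xml</param-value>
</context-param>
<context-param>
  <param-name>liquibase.datasource</param-name>
  <param-value>java:comp/env/jdbc/appDS</param-value>
</context-param>
<listener>
  <listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>
```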