Hi,

I'm writing some integration tests in JUnit. When I run all the tests together in a row (and not separately), the data persisted in the database keeps changing, and the tests find unexpected data (inserted by the previous test) during their execution.

I was thinking of using DbUnit, but I wonder whether it resets the auto-increment index between executions (because the tests also check the IDs of the persisted entities).

Thanks

M.

A: 

Essentially this issue can be resolved in two ways.

  1. Wrap your DB-related tests in a transaction. Begin the transaction before the test; when the test run is over, always roll back the transaction. This way no changes made in the test are retained.

  2. Use something like DBUnit to mock the DB-access classes so that no data ever reaches the database and your classes return results as though the DB operations had been performed.

If you do access the DB when running your tests, I prefer approach 1.
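
A minimal sketch of approach 1 with JUnit 4 and Hibernate (HibernateUtil.getSessionFactory() is just a placeholder for however you obtain the SessionFactory):

    import org.hibernate.Session;
    import org.hibernate.Transaction;
    import org.junit.After;
    import org.junit.Before;

    // Base class for DB integration tests: every change made by a test
    // is rolled back, so the next test starts from the same state.
    public abstract class TransactionalTestBase {

        protected Session session;
        private Transaction tx;

        @Before
        public void beginTransaction() {
            // HibernateUtil.getSessionFactory() is assumed to exist in your project
            session = HibernateUtil.getSessionFactory().openSession();
            tx = session.beginTransaction();
        }

        @After
        public void rollbackTransaction() {
            if (tx != null && tx.isActive()) {
                tx.rollback();   // discard everything the test wrote
            }
            session.close();
        }
    }

Tests extend this class and use session directly; nothing they persist survives the rollback.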

Fazal
+5  A: 

It's a best practice to put your database in a known state before each test execution, and DbUnit provides everything required for that. But don't rely on auto-incremented columns; include the id values explicitly in your DbUnit dataset. Pros: you can manually verify the database state after executing a test that fails. Cons: you need to set up and maintain datasets.
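
For example, a minimal sketch of loading such a dataset before each test, assuming DbUnit 2.4+ and an in-memory H2 database (the JDBC URL and the dataset.xml path are placeholders):

    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;

    import org.dbunit.database.DatabaseConnection;
    import org.dbunit.database.IDatabaseConnection;
    import org.dbunit.dataset.IDataSet;
    import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
    import org.dbunit.operation.DatabaseOperation;
    import org.junit.Before;

    public abstract class DbUnitTestBase {

        @Before
        public void seedDatabase() throws Exception {
            // dataset.xml lists the rows, ids included, e.g.:
            // <dataset>
            //   <customer id="1" name="Alice"/>
            // </dataset>
            IDataSet dataSet = new FlatXmlDataSetBuilder()
                    .build(new FileInputStream("src/test/resources/dataset.xml"));

            Connection jdbc = DriverManager.getConnection("jdbc:h2:mem:test", "sa", "");
            IDatabaseConnection connection = new DatabaseConnection(jdbc);
            try {
                // DELETE_ALL followed by INSERT: the tables end up in exactly the dataset's state
                DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
            } finally {
                connection.close();
            }
        }
    }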

The other approach is to run each test method inside a transaction (and to roll back the transaction at the end of the execution). Pros: data are easier to set up and maintain (in the database). Cons: fixing a failed test is less convenient.

Pascal Thivent
+1  A: 

You can run either single tests or, if applicable, groups of tests as a single transaction, and then roll back at the end. (This might be difficult if the tests themselves comprise multiple transactions and your DB doesn't support nested transactions or savepoints.)

Alternatively, have your test database created by scripts. DbUnit can help here, as can other database generators, such as LiquiBase, dbmaintain, or dbmigrate. You can then drop the entire database and recreate it for each test or test group. The usefulness of this approach diminishes as the test dataset becomes large and the overhead increases.
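
A rough sketch of the script approach with plain JDBC (schema.sql and the connection URL are placeholders, and the naive split on ';' only works for simple scripts):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    import org.junit.Before;

    public abstract class RecreateSchemaTestBase {

        @Before
        public void recreateSchema() throws Exception {
            // schema.sql contains the DROP/CREATE statements, separated by ';'
            String script = new String(
                    Files.readAllBytes(Paths.get("src/test/resources/schema.sql")));
            try (Connection connection = DriverManager.getConnection("jdbc:h2:mem:test", "sa", "");
                 Statement statement = connection.createStatement()) {
                for (String sql : script.split(";")) {
                    if (!sql.trim().isEmpty()) {
                        statement.execute(sql);
                    }
                }
            }
        }
    }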

A final option is not to have your tests depend upon the generated id, since depending upon the generated value creates brittle tests. It's useful to verify id generation, so test these values in a few tests, but I'm not sure there is value in checking the IDs in all tests.

EDIT: The OP asked about using Hibernate to recreate the schema. This can be arranged by creating a new SessionFactory for each test and setting "hibernate.hbm2ddl.auto" to "create" (or "create-drop") when building the SessionFactory. I mentioned the diminishing effectiveness of drop-create above; it applies to this case too.
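
Something along these lines, assuming Hibernate 3-style configuration (the base-class name is only illustrative):

    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;
    import org.junit.After;
    import org.junit.Before;

    public abstract class FreshSchemaTestBase {

        protected SessionFactory sessionFactory;

        @Before
        public void buildFreshSchema() {
            Configuration configuration = new Configuration().configure();
            // "create" drops and recreates the schema when the SessionFactory is built
            configuration.setProperty("hibernate.hbm2ddl.auto", "create");
            sessionFactory = configuration.buildSessionFactory();
        }

        @After
        public void disposeSessionFactory() {
            if (sessionFactory != null) {
                sessionFactory.close();   // release connections between tests
            }
        }
    }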

mdma
What about forcing Hibernate to manually reset the schema? Maybe in an @After method of my base test class.
Marco
Sure, if your schema is already being managed by Hibernate then this is a possibility. I've added a little extra to my answer.
mdma
Note that while possible, recreating the schema for each test doesn't scale at all and is really slow. That's not a good idea IMO.
Pascal Thivent
@Pascal Thivent, agreed - a drop-create cycle on each test is in general not a good idea, whether it be Hibernate schema creation or setup scripts.
mdma
You're right, I feel it's not the right solution either, but as a temporary fix it should work (aka deadline :)
Marco
Anyway something strange happens: I have created the following method in my base test class (in the next comment because of the character limit). Unfortunately it doesn't seem to work and it unexpectedly blocks after the second drop-create operation. After many tries, I still can't understand why:
Marco
    @Before
    public void before() {
        Configuration configuration = new Configuration().configure();
        SessionFactory sessionFactory = configuration.buildSessionFactory();
        Session session = sessionFactory.openSession();
        Connection connection = session.connection();
        SchemaExport schemaExport = new SchemaExport(configuration, connection);
        schemaExport.execute(false, true, true, true);
        session.close();
    }
Marco
At a guess, you'll probably want to include an @After method that disposes of the previous SessionFactory. Maybe there are db objects that are locked from the previous test? Can you print a stacktrace to see where it blocks?
mdma
Unfortunately I've changed my code a lot, so it's harder now to reproduce the environment. Anyway, the problem seems to be in the Hibernate Session management between different test executions. The base class gets the session object used to export the schema by calling HibernateUtil.getCurrentSession(), then the test class executes some DAO methods that internally get the session object the same way. When the second test is executed, HibernateUtil.getCurrentSession() returns an invalid session and all sorts of exceptions are thrown by Hibernate. Here is the stack trace of the exception:
Marco
    java.lang.StackOverflowError
        at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.hibernate.jdbc.BorrowedConnectionProxy.invoke(BorrowedConnectionProxy.java:74)
        at $Proxy4.createStatement(Unknown Source)
Marco
A: 

It is bad to rely on id values in a test, because auto-increment behavior is database specific. I would never check the IDs: if you do, your test depends on the entities being assigned particular id values, which is not a real-life scenario. A test should be independent of auto-increment IDs.
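
For example, instead of asserting on a particular generated id, look the entity up by a business key and assert on its data (Customer and CustomerDao here are hypothetical):

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertNotNull;

    import org.junit.Test;

    public class CustomerDaoTest {

        @Test
        public void persistedCustomerCanBeFoundByEmail() {
            CustomerDao dao = new CustomerDao();              // hypothetical DAO
            dao.save(new Customer("alice@example.com", "Alice"));

            // No assumption about which auto-increment value was assigned:
            Customer found = dao.findByEmail("alice@example.com");
            assertNotNull("id should be generated", found.getId());
            assertEquals("Alice", found.getName());
        }
    }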

khmarbaise