views: 329

answers: 4

Hi guys,

I am trying to reduce our overall integration test execution time, and I am currently evaluating various in-memory DB solutions. The idea is to have the DAOs hit an in-memory DB during the tests as opposed to hitting a real DB. This is a Java app using Hibernate for persistence.

I'd be interested to hear about your experience with any of these products: H2, Derby, HSQLDB, or Oracle Berkeley DB.

Some of my concerns are: will in-memory DBs be able to execute stored procedures and custom native SQL? Can you selectively choose which of your services hit the real DB and which hit the in-memory DB?

And overall, since this approach involves DB bootstrapping (pre-creating all tables and pre-loading them with data), I am now wondering whether it would simply be easier to mock out the DAO layer and not worry about all the unknown problems an in-memory DB may bring...

thanks.

+1  A: 

Mock out the DAO layer.

Despite what some claim, unless you are only using trivial SQL, the subtle implementation differences and differing feature sets between databases will limit what you can do (stored procedures, views, etc.) and, to some extent, invalidate the tests.

My personal mocking framework of choice is Mockito, but there are lots that do the job, and mocking out the DAO is standard practice, so you'll find plenty of documentation.
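To show how little code this takes, here is a sketch of testing a service with a stubbed-out DAO. The `UserDao` and `UserService` names are hypothetical, and to keep the example dependency-free it uses a hand-rolled stub rather than Mockito (Mockito's `when(...).thenReturn(...)` would replace the lambda below):

```java
import java.util.Optional;

// Hypothetical DAO interface the service depends on.
interface UserDao {
    Optional<String> findEmailByUsername(String username);
}

// Service under test: the logic we want to verify without a database.
class UserService {
    private final UserDao dao;
    UserService(UserDao dao) { this.dao = dao; }

    String emailOrDefault(String username) {
        return dao.findEmailByUsername(username).orElse("unknown@example.com");
    }
}

class UserServiceTest {
    public static void main(String[] args) {
        // Stub standing in for the real, database-backed DAO implementation.
        UserDao stub = username ->
                "alice".equals(username) ? Optional.of("alice@example.com")
                                         : Optional.empty();

        UserService service = new UserService(stub);

        // The service logic is exercised with no database in sight.
        if (!service.emailOrDefault("alice").equals("alice@example.com"))
            throw new AssertionError("known user");
        if (!service.emailOrDefault("bob").equals("unknown@example.com"))
            throw new AssertionError("unknown user falls back to default");
        System.out.println("service tests passed without touching a database");
    }
}
```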

Pablojim
Mocking the DAO requires too much effort and gives very little feedback.
stepancheg
With modern mock frameworks, mocking the DAO layer should require very little effort.
Pablojim
It only requires effort if you don't have it cleanly separated. I mock my DAL all the time in .NET; it requires almost no effort when I use Moq.
RichardOD
+1  A: 

It is a bad idea to use different databases for unit testing and for production.

BTW, testing against a real database should be fast; if it isn't, you are probably doing something wrong in your tests.

stepancheg
Personally, I think that if testing on your real database is slow, that is important information: it means the queries will probably perform unacceptably in production too! I see no value in testing on a different backend from the one you intend to actually use. You might learn that the queries work, but that is only half of what you need to test, unless you like reverting to the old code because the new code times out in production.
HLGEM
+2  A: 

My suggestion is to test everything, including the DAO layer as you mention. But see if you can test it in pieces: services, DAOs, UI.

For service-layer testing, mock out the DAOs. That way the service-layer tests are independent of whether the DAOs are working. If the service-layer tests use DAOs against a real database, then I'd argue it's not really a unit test but an integration test. Those are valuable too, but when they fail they don't pinpoint the problem the way a unit test does.

For our DAO-layer tests we use DbUnit with HSQLDB. (Unitils helps if you are using Spring/Hibernate/DbUnit and want to tie it all together.) Our DAO tests run quickly, which is important when you have 500+ tests. The in-memory DB schema is built from our schema creation scripts, so as a side effect we are testing those as well. We load/refresh a known set of data from flat files into the in-memory database. (Compare that to when we were using the DEV database, where data would occasionally get removed and break tests.) This solution is working great for us and I would recommend it to anyone.

Note, however, that we are not able to test the DAO that uses a stored proc this way (fortunately we only have one). I disagree somewhat with the poster who said that using different databases is "bad": just be aware of the differences and know the implications of doing so.

You didn't mention whether you are using Hibernate; that is an important factor, since it abstracts us away from modifying any SQL that may be specific to Oracle, SQL Server, or HSQLDB, which another poster mentioned.
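For reference, pointing a Hibernate test configuration at an in-memory HSQLDB usually only takes a handful of properties. This is a sketch of a `hibernate.cfg.xml` fragment, assuming the HSQLDB jar is on the test classpath; the database name `testdb` is arbitrary:

```xml
<!-- Test-only connection settings: private in-memory HSQLDB instance -->
<property name="hibernate.connection.driver_class">org.hsqldb.jdbcDriver</property>
<property name="hibernate.connection.url">jdbc:hsqldb:mem:testdb;shutdown=true</property>
<property name="hibernate.connection.username">sa</property>
<property name="hibernate.connection.password"></property>
<property name="hibernate.dialect">org.hibernate.dialect.HSQLDialect</property>
<!-- create-drop gives every test run a fresh, empty schema -->
<property name="hibernate.hbm2ddl.auto">create-drop</property>
```

Because the database lives only in the JVM's memory, each test run starts from a clean slate, which sidesteps the "stale shared DEV data" problem described above.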

dustmachine
dustmachine - thanks for your reply, it's very helpful. Yes, we are using Hibernate and the underlying DB is Oracle. We have a lot of stored procs and a lot of crazy native SQL.
I agree that you should run integration tests. However, testing on a different database from the one you will ultimately use is of limited value: what exactly does it prove? And if the only gain from this approach is speed, you need to ask why your integration tests are so slow. With DbUnit and small sets of test data, your integration tests on Oracle should be very fast.
Pablojim
Frameworks like Hibernate allow testing without caring what database is on the back end. A UserDAO's findUserByName method performs as expected whatever the DB vendor, if the contract says case-insensitive, for example. Testing against a live database is brittle because it requires the data to be in a steady state: developers need to remember not to change certain records, or tests need to perform lots of setup before each test. This can be overcome, but having a fresh database each time is easier, especially with lots of FKs. In-memory databases will always be faster when running 500 tests.
dustmachine
A: 

I just came across Oracle TimesTen, an in-memory DB: http://www.oracle.com/technology/products/timesten/index.html

This seems like possibly the most painless solution, since no additional mocking/configuration is required. You still have all of your integration tests intact hitting the DB, but now the data is delivered faster. What do you guys think?