Hi there.
At work, we've found our test suite has got to the point where it's too slow to run repeatedly, which I really don't like. It takes at least 5 minutes for the entire suite, and over 3 minutes for just the back-end data object tests. So, I'm curious to hear how other people do their testing.
At the moment, we have a single database server with a live schema and a _test schema. When a test runs, it first runs an SQL script that populates the test database (and clears out any old data that might get in the way). This happens for almost all tests. From what I can see, this is the biggest bottleneck in our tests - I've just profiled one test and it takes about 800ms to set up the database, and then each subsequent test runs in about 10ms.
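To make that concrete, here's roughly what the per-test setup looks like. This is a simplified sketch using pytest and psycopg2 as stand-ins for our actual stack, and all the names (connection string, script path, tables) are made up:

```python
import psycopg2
import pytest

TEST_DSN = "dbname=app_test user=test"                    # made-up connection string
SETUP_SCRIPT = "tests/fixtures/populate_test_schema.sql"  # made-up path

@pytest.fixture
def db():
    conn = psycopg2.connect(TEST_DSN)
    with conn, conn.cursor() as cur:
        # The ~800ms part: clear out old data and repopulate the test schema.
        cur.execute(open(SETUP_SCRIPT).read())
    yield conn
    conn.close()

def test_user_count(db):
    # The actual test logic is quick (~10ms) once the data is in place.
    with db.cursor() as cur:
        cur.execute("SELECT count(*) FROM users")
        assert cur.fetchone()[0] > 0
```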
I've been looking into some solutions, and here's what I've found so far:
Have the test schema populated once, and roll back changes at the end of each test.
This seems to be the easiest solution, but it does mean we're going to have to add some special-case handling to test things that depend on a rollback themselves (i.e. error handling tests).
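If we went this way, I imagine it would look something like the sketch below (again pytest plus psycopg2 as stand-ins, names made up): populate the schema once per run, then wrap each test in a transaction that gets thrown away.

```python
import psycopg2
import pytest

TEST_DSN = "dbname=app_test user=test"  # made-up connection string

@pytest.fixture(scope="session")
def connection():
    # The expensive ~800ms population now happens once per run.
    conn = psycopg2.connect(TEST_DSN)
    with conn, conn.cursor() as cur:
        cur.execute(open("tests/fixtures/populate_test_schema.sql").read())
    yield conn
    conn.close()

@pytest.fixture
def db(connection):
    # Each test runs inside a transaction that is rolled back afterwards,
    # so the shared data never actually changes.
    yield connection
    connection.rollback()

def test_delete_user(db):
    with db.cursor() as cur:
        cur.execute("DELETE FROM users WHERE id = 1")
        cur.execute("SELECT count(*) FROM users WHERE id = 1")
        assert cur.fetchone()[0] == 0
    # the db fixture rolls this back, so the row is still there for later tests
```

The error handling tests would presumably need the special casing I mentioned, maybe by running them inside a SAVEPOINT so their own rollback doesn't blow away the outer transaction.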
Mock the database where possible
We would set up the database for the data object being tested, but mock anything it depends on. To me, this doesn't seem brilliant, for two reasons. Firstly, when we set the database up, we still (usually) end up with many more rows than we actually need because of foreign-key dependencies. And secondly, most data object models don't really interact with other ones; they just do JOINs.
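For what it's worth, this is the sort of thing I mean by mocking the dependency. The OrderModel/CustomerModel classes here are just made-up stand-ins, using Python's unittest.mock:

```python
from unittest import mock

# Made-up stand-ins for two of our data objects.
class CustomerModel:
    @staticmethod
    def load(customer_id):
        raise NotImplementedError("would normally hit the database")

class OrderModel:
    def __init__(self, customer_id, subtotal):
        self.customer = CustomerModel.load(customer_id)
        self.subtotal = subtotal

    def total(self):
        return self.subtotal * (1 - self.customer.discount_rate)

def test_order_total_with_mocked_customer():
    # No customer rows need to exist in the test schema at all.
    fake_customer = mock.Mock(discount_rate=0.1)
    with mock.patch.object(CustomerModel, "load", return_value=fake_customer):
        order = OrderModel(customer_id=42, subtotal=100)
        assert order.total() == 90.0
```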
Run the same system, but use dumps and RAMFS
Instead of running a big SQL script, we would load a database dump (rough sketch below). The test server would run on a RAMFS partition, which would hopefully bring some speed benefits.
I can't test this, though, because I'm on OS X and, from what I can see, there isn't ramfs support.
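For completeness, the dump-loading half would be something like this (a sketch around pg_restore with a custom-format dump; the database name and dump path are made up), with the test cluster's data directory sitting on the RAM-backed mount:

```python
import subprocess
import pytest

TEST_DB = "app_test"                         # made-up database name
DUMP_FILE = "tests/fixtures/test_data.dump"  # made-up pg_dump -Fc output

@pytest.fixture
def fresh_db():
    # Replaces the big SQL setup script: drop and recreate the dumped
    # objects in the test database before each test.
    subprocess.run(
        ["pg_restore", "--clean", "--if-exists", "--no-owner",
         "-d", TEST_DB, DUMP_FILE],
        check=True,
    )
```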
There are some other options around, like using SQLite, but that isn't an option for us as we depend on some PostgreSQL-specific extensions.
Halp! :)