Our team has hundreds of integration tests that hit a database and verify results. I have two base classes for all the integration tests: one for retrieve-only tests and one for create/update/delete (CUD) tests. The retrieve-only base class regenerates the database during TestFixtureSetUp, so it executes only once per test class. The CUD base class regenerates the database before each test. Each repository class has its own corresponding test class.
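Roughly, the two base classes look like this (NUnit; the class names and the TestDatabase.Recreate helper are illustrative stand-ins for our regeneration logic):

using NUnit.Framework;

public abstract class RetrieveOnlyTestBase {
    [TestFixtureSetUp]
    public void RecreateDatabaseOncePerFixture() {
        // Runs once per test class: acceptable because these tests never mutate data.
        TestDatabase.Recreate();
    }
}

public abstract class CudTestBase {
    [SetUp]
    public void RecreateDatabasePerTest() {
        // Runs before every test: guarantees isolation, but is slow.
        TestDatabase.Recreate();
    }
}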

As you can imagine, this whole thing takes quite some time (approaching 7-8 minutes to run and growing quickly). Having it run as part of our CI (CruiseControl.NET) is not a problem, but running locally takes so long that it effectively prohibits running the tests before committing code.

My question is: are there any best practices for speeding up the execution of these types of integration tests?

I'm unable to execute them in-memory (à la SQLite) because we use database-specific functionality (computed columns, etc.) that isn't supported in SQLite.

Also, the whole team has to be able to execute them, so running them against local instances of SQL Server Express could be error-prone unless the connection strings are the same across those instances.

How are you accomplishing this in your shop and what works well?

Thanks!

+4  A: 

I'm a Java developer but have dealt with a similar problem. I found that running a local database instance works well because of the speed (no data to send over the network) and because it avoids contention on a shared integration test database.

The general approach we use to solve this problem is to have the build scripts read the database connection strings from a configuration file, with one file per environment: for example, one file for WORKSTATION and another for CI. The build scripts then read the config file for the specified environment, so builds running on a developer workstation use the WORKSTATION configuration and builds running in the CI environment use the CI settings.
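A minimal sketch of that idea in C#, assuming an environment variable selects the file (the variable name and file layout are inventions for illustration, not part of the original setup):

using System;
using System.IO;

static class TestConfig {
    // Reads e.g. config/WORKSTATION.connectionstring or config/CI.connectionstring.
    public static string GetConnectionString() {
        string env = Environment.GetEnvironmentVariable("TEST_ENVIRONMENT") ?? "WORKSTATION";
        return File.ReadAllText(Path.Combine("config", env + ".connectionstring")).Trim();
    }
}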

It also helps tremendously if the entire database schema can be created from a single script, so each developer can quickly set up a local database for testing. You can take this a step further and make the database setup part of the build process, so the whole setup is scripted and keeps pace with changes to the schema.
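A hedged sketch of such a scripted setup, assuming a single schema script and naively splitting on GO batch separators (a real script runner such as sqlcmd or SMO handles batching more robustly):

using System;
using System.Data.SqlClient;
using System.IO;

static class DatabaseSetup {
    public static void RecreateSchema(string connectionString, string scriptPath) {
        string script = File.ReadAllText(scriptPath);
        using (var connection = new SqlConnection(connectionString)) {
            connection.Open();
            // Naive split on GO batch separators.
            foreach (string batch in script.Split(
                new[] { "\r\nGO\r\n", "\nGO\n" }, StringSplitOptions.RemoveEmptyEntries)) {
                using (var command = new SqlCommand(batch, connection)) {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}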

Ken Liu
+8  A: 

In NUnit you can decorate your test classes (or methods) with a category attribute, e.g.:

[Category("Integration")]
public class SomeTestFixture{
    ...
}
[Category("Unit")]
public class SomeOtherTestFixture{
    ...
}

You can then stipulate in the build process on the server that all categories get run, and require that your developers run only a subset of the available test categories. Which categories they are required to run will depend on factors you understand better than I do. But the gist is that they are able to test at the unit level while the server handles the integration tests.
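For example, with the NUnit 2.x console runner the split looks something like this (the assembly name is illustrative):

rem Developers run only the fast tests:
nunit-console.exe MyProject.Tests.dll /exclude:Integration

rem The CI server runs everything:
nunit-console.exe MyProject.Tests.dll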

grenade
+1 That's pretty much what we do at my work. On our CI server, unit tests run on every check-in and integration tests run once a day.
mezoid
Where I work we run the integration tests on every build; the more frequently you run the integration tests, the better.
Ken Liu
@Ken_Liu - for sure. Often it's a case of balancing the available CPU cycles on the CI server against the commit frequency. This will vary for every development team/environment.
grenade
+2  A: 

Have you done any measurements (using timers or similar) to determine where the tests spend most of their time?

If you already know that database recreation is what makes them time-consuming, a different approach is to regenerate the database once and use transactions to preserve state between tests: each CUD-type test starts a transaction in setup and performs a rollback in teardown. This can significantly reduce per-test setup time, since a transaction rollback is much cheaper than a full database recreation.
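A minimal sketch of that approach with NUnit and System.Transactions (the class name is illustrative); it assumes the code under test enlists in the ambient transaction:

using System.Transactions;
using NUnit.Framework;

public abstract class RollbackTestBase {
    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction() {
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollbackTransaction() {
        // Disposing without calling Complete() rolls the transaction back,
        // returning the database to its pre-test state.
        _scope.Dispose();
    }
}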

henrik
Yeah, I've seen this idea before, but what if some of the code you test uses transactions itself (remember, these are integration tests)? Then everything breaks, so I don't think this is practical in general.
sleske
If you're doing explicit transaction management inside your methods, you're absolutely right. This approach works best where you use aspects or a proxy to handle transactions. We used it in a project based on Spring, and the integration test setup replaced the proxy with a test-managed transaction. Worked like a charm.
henrik
+3  A: 

We have an SQL Server Express instance with the same DB definition running on every dev machine as part of the dev environment. With Windows authentication the connection strings are stable: no username/password in the string.
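For example, every machine can share something like this (the database name is illustrative):

Data Source=.\SQLEXPRESS;Initial Catalog=IntegrationTests;Integrated Security=SSPI;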

What we would really like to do, but haven't yet, is see whether we can get our system to run on SQL Server Compact Edition, which is like SQLite but with SQL Server's engine. Then we could run the tests in-memory, and possibly in parallel as well (with multiple processes).

orip
Mmm, SQL Compact. That *would* be sweet!
grenade
I like the idea of that, for sure!
Chris Conway
+3  A: 

Keep your fast (unit) and slow (integration) tests separate, so that you can run them separately. Use whatever method for grouping/categorizing the tests is provided by your testing framework. If the testing framework does not support grouping the tests, move the integration tests into a separate module that has only integration tests.

The fast tests should take only a few seconds to run in full and should provide high code coverage. These kinds of tests let developers refactor ruthlessly: they can make a small change, run all the tests, and be confident that the change did not break anything.

The slow tests can take many minutes to run, and they make sure that the individual components work together correctly. When developers make changes that might break something covered by the integration tests but not the unit tests, they should run those integration tests before committing. Otherwise, the slow tests are run by the CI server.

Esko Luontola