Hi, in my project we have hundreds of test cases. These test cases are part of the build process, which gets triggered on every checkin and sends mail to our developer group. The project is fairly big and has been around for more than five years.
Now we have so many test cases that the build takes more than an hour. Some of the test cases are not structured properly, and after refactoring them I was able to reduce the running time substantially, but we have hundreds of test cases and refactoring them one by one seems a bit too much.
For now I run some of the test cases (the ones that take really long to execute) only as part of the nightly build and not on every checkin.
I am curious how others manage this.

+3  A: 

I believe it was in "Working Effectively with Legacy Code" that Michael Feathers said if your test suite takes longer than a couple of minutes, it will slow developers down too much and the tests will start getting neglected. It sounds like you are falling into that trap.

Are your test cases running against a database? If so, that's most likely your biggest source of performance problems. As a general rule, test cases shouldn't do I/O if it can be avoided. Dependency Injection lets you replace a database object with mock objects that simulate the database portion of your code. That allows you to test the code without worrying about whether the database is set up correctly.
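To make that concrete, here is a minimal sketch of constructor-based Dependency Injection (the class and method names are invented for illustration, not taken from any real project): the code under test depends on an interface, and the test substitutes an in-memory fake, so nothing ever touches the database.

    import static org.junit.Assert.assertEquals;

    import java.util.HashMap;
    import java.util.Map;
    import org.junit.Test;

    // Production code depends on this interface, not on a concrete DB class.
    interface CustomerRepository {
        String findName(int id);
    }

    // Test double: pure in-memory, no I/O, so it runs in microseconds.
    // The real implementation (wrapping JDBC/Oracle) is used in production.
    class InMemoryCustomerRepository implements CustomerRepository {
        private final Map<Integer, String> rows = new HashMap<>();
        void add(int id, String name) { rows.put(id, name); }
        @Override public String findName(int id) { return rows.get(id); }
    }

    // The class under test receives its dependency through the constructor.
    class GreetingService {
        private final CustomerRepository repo;
        GreetingService(CustomerRepository repo) { this.repo = repo; }
        String greet(int id) { return "Hello, " + repo.findName(id); }
    }

    public class GreetingServiceTest {
        @Test
        public void greetsWithoutTouchingTheDatabase() {
            InMemoryCustomerRepository repo = new InMemoryCustomerRepository();
            repo.add(42, "Ada");
            assertEquals("Hello, Ada", new GreetingService(repo).greet(42));
        }
    }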

I highly recommend Working Effectively with Legacy Code by Michael Feathers. He discusses how to handle a lot of the headaches that you seem to be running into without having to refactor the code all at once.

UPDATE:

Another thing that might help is something like NDbUnit. I haven't used it extensively yet, but it looks promising: http://code.google.com/p/ndbunit/

HVS
A possible alternative for the db-bound tests is to use an in-memory database. This obviously comes with a new set of problems, but it is a solution, and its success depends on just how much of your code relies on a single RDBMS implementation.
Gareth Davis
Well, it seems like a good idea, and I am not concerned about relying on a single RDBMS implementation, as we have no plans to move away from Oracle. The only thing that worries me is getting all the db constraints set up in the in-memory db and kept in sync with the actual db. Note: can you point to any in-memory db that is close to Oracle, so that my test env and the actual prod env stay in sync?
Khangharoth
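One candidate worth looking at is H2: it runs fully in memory and has an Oracle compatibility mode (MODE=Oracle). Below is a minimal sketch; the connection details and the table are placeholders, and the compatibility mode only approximates Oracle semantics, so it will not catch every Oracle-specific behavior. Running the same DDL scripts against both databases is the usual way to keep the constraints in sync.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class InMemoryDbSmokeTest {
        public static void main(String[] args) throws Exception {
            // MODE=Oracle turns on H2's Oracle compatibility mode;
            // DB_CLOSE_DELAY=-1 keeps the in-memory schema alive for the
            // lifetime of the JVM instead of dropping it per connection.
            try (Connection c = DriverManager.getConnection(
                    "jdbc:h2:mem:testdb;MODE=Oracle;DB_CLOSE_DELAY=-1", "sa", "");
                 Statement s = c.createStatement()) {
                // Run the same DDL used for the real Oracle schema so the
                // constraints stay in sync between test and production.
                s.execute("CREATE TABLE customers ("
                        + "id NUMBER PRIMARY KEY, "
                        + "name VARCHAR2(100) NOT NULL)");
                s.execute("INSERT INTO customers VALUES (1, 'Ada')");
            }
        }
    }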
A: 

Perhaps you could consider keeping your Oracle database but running it from a RAM drive? It wouldn't need to be large, because it would only contain test data.
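A rough sketch of that idea, assuming a tmpfs ramdisk is already mounted at /mnt/ramdisk (the mount path, connection details, and sizes are all placeholders): the test tablespace's datafile is simply placed on the ramdisk, so all test-data I/O happens in RAM.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateRamDiskTablespace {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details for a local test instance.
            try (Connection c = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:XE", "system", "password");
                 Statement s = c.createStatement()) {
                // Ordinary Oracle DDL; only the datafile location is unusual.
                // Anything created in this tablespace lives on the ramdisk.
                s.execute("CREATE TABLESPACE test_data "
                        + "DATAFILE '/mnt/ramdisk/test_data.dbf' SIZE 256M");
            }
        }
    }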

A: 

We have about 1000 tests, a large percentage of them communicating through REST and hitting the database. Total run time is about 8 minutes. An hour seems excessive, but I don't know what you are doing or how complex your tests are.

But I think there is a way to help you. We are using TeamCity, and it has a nice ability to run multiple build agents. What you could do is split your test project into subprojects, with each subproject containing just a subset of the tests. You could use JUnit/NUnit categories to separate them. Then you would configure TeamCity so that each agent builds just one subproject. This way you get parallel execution of tests. With a few agents (you get three for free), you should be able to get down to 20 minutes, which might even be acceptable. If you put each agent into a VM, you might not even need additional machines; you just need lots of RAM.
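As a rough illustration of the split with JUnit 4 categories (the class and category names are invented; NUnit's [Category] attribute works along the same lines), each agent would then be pointed at one such suite:

    import org.junit.Test;
    import org.junit.experimental.categories.Categories;
    import org.junit.experimental.categories.Categories.IncludeCategory;
    import org.junit.experimental.categories.Category;
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite.SuiteClasses;

    // One suite per build agent; this agent picks up only the slow tests.
    // A mirror-image suite using Categories.ExcludeCategory would cover
    // the fast, per-checkin build.
    @RunWith(Categories.class)
    @IncludeCategory(SlowTestSuite.SlowTests.class)
    @SuiteClasses({SlowTestSuite.OrderServiceTest.class})
    public class SlowTestSuite {

        // Marker interface used as a category label.
        public interface SlowTests {}

        // Nested only to keep the sketch in one file; in a real project
        // the test classes live in their own files.
        public static class OrderServiceTest {
            @Test
            public void totalIsComputedInMemory() {
                // fast: pure computation, runs on every checkin
            }

            @Category(SlowTests.class)
            @Test
            public void orderSurvivesRestRoundTrip() {
                // slow: REST call plus database access, nightly/agent build
            }
        }
    }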

bh213
