views: 202

answers: 3

In our project we have plenty of unit tests. They help keep the project rather well tested.

Besides them we have a set of tests which are written as unit tests but depend on some kind of external resource; we call them external tests. They may, for example, access a web service or something similar.

While the unit tests are easy to run, the integration tests sometimes fail to pass - for example due to a timeout error. These tests can also take too much time to run.

Currently we keep the integration/external tests and run them only when developing the corresponding functionality.

For plain unit tests we use TeamCity for continuous integration.

How do you run your integration tests, and when do you run them?

+1  A: 

We run all the tests in one huge suite. It takes 7 minutes to run.

Our integration tests create mock servers. They never time out -- except when the test requires the server to time out.

So we have the following kinds of things. (The code samples are in Python.)

class SomeIntegrationTest( unittest.TestCase ):
    def setUp( self ):
        testclient.StartVendorMockServer( 18000 ) # port number
        self.connection = applicationLibrary.connect( 'localhost', 18000 )
    def test_should_do_this( self ):
        self.connection.this()
        self.assert...
    def tearDown( self ):
        testclient.KillVendorMockServer( 18000 )

This has some limitations -- it forks the mock server anew for each test. Sometimes that's okay, and sometimes that's too much starting and stopping.

We also have the following kinds of things:

class SomeIntegrationTest( unittest.TestCase ):
    def setUp( self ):
        self.connection = applicationLibrary.connect( 'localhost', 18000 )
    def test_should_do_this( self ):
        self.connection.this()
        self.assert...

if __name__ == "__main__":
    testclient.StartVendorMockServer( 18000 ) # port number
    suite = unittest.TestLoader().loadTestsFromTestCase( SomeIntegrationTest )
    result = unittest.TextTestRunner().run( suite )
    testclient.KillVendorMockServer( 18000 )
    sys.exit( len( result.failures ) + len( result.errors ) )

To support this testing, we have a number of mocked-up servers for various kinds of integration tests.
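
For illustration, one of those mocked-up servers doesn't have to be elaborate. A minimal sketch of what such a server might look like, assuming the vendor protocol were plain HTTP (this uses Python 3's http.server; the real testclient module and vendor protocol aren't shown here):

import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class VendorMockHandler( BaseHTTPRequestHandler ):
    def do_GET( self ):
        # Return a canned response -- just enough to make the test pass.
        body = b'{"status": "ok"}'
        self.send_response( 200 )
        self.send_header( "Content-Type", "application/json" )
        self.send_header( "Content-Length", str( len( body ) ) )
        self.end_headers()
        self.wfile.write( body )

def start_vendor_mock_server( port ):
    # Serve in a daemon thread; call shutdown() on the returned server to stop it.
    server = HTTPServer( ( 'localhost', port ), VendorMockHandler )
    threading.Thread( target=server.serve_forever, daemon=True ).start()
    return server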

S.Lott
Do you require developers to run the tests before a check-in?
matt b
That's a good approach - mocking up remote services. I think we don't do that because sometimes it's hard to do the mock-up, or it can take some time. Also sometimes I don't even know how the remote ws works. A third thing is that sometimes it's helpful to test that the remote service actually works as expected. We had cases when a major service provider had issues that could potentially have been found by some external tests.
Vladimir
@Vladimir: "sometimes I don't even know how remote ws works". False. You know what your application sends and receives. That's all you have to handle in the Mockup. Nothing more, just enough to make the test pass.
S.Lott
@matt b: Of course. Yes, it takes time to run the complete test suite. Software is complex and difficult and it takes real work to test.
S.Lott
@S.Lott, I agree that we should have mock-ups which prove that the remote ws works as we expect. But there is another stage - when you develop against a certain remote ws for the first time, or when the ws changes its protocol. Do you have tests which check that the ws works the way your mock-ups expect?
Vladimir
@Vladimir: Those aren't "tests". "First time develop" and "when the ws changed the protocol" aren't things you find by testing. That's "design" and, before that, "requirements". Nothing to do with testing, except that you may write some small software (we call them "spike solutions") to determine what the protocol *really* is. But this is not "testing": this is finding out what requirements you have and creating a design.
S.Lott
@S.Lott I believe they are tests too. "First time develop" is a kind of Learning Test (I believe you've heard about them). If you develop against a rather small ws which is quite stable in its API, that's fine, but in the real world we are working with major live web services which update their APIs rather often, and we even find bugs in their ws sometimes. The proof of that is that today we found a bug which shows that we don't know how the remote ws should communicate with our software :-) To clarify that, we should ask the remote ws's support team, and writing a good test would be nice too.
Vladimir
@Vladimir: "Learning Tests" is a contradiction. You can only "test" when you have **Expected Results** and your software is shown to produce the **Expected Results**. If you're exploring or learning something new, that does not fit the definition of "test". That's "Exploring" and "Learning", not "Testing". I realize that many times you must explore web services to understand a change. I understand that when you learn something you find that software is wrong. None of that is "testing". That's learning.
S.Lott
+2  A: 

We're using Maven2: maven-surefire-plugin to run unit tests (in the test phase) and maven-failsafe-plugin for integration tests (integration-test phase).

By default, all tests run when the project is built; however, integration tests can be turned off using profiles.
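
For illustration, the failsafe half of such a setup might look roughly like this; the profile name skipIntegration is just a placeholder, and using the skipITs property is one common way to switch the tests off, not a detail given above:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- runs the *IT test classes in the integration-test phase,
             then fails the build in the verify phase if any of them failed -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>

<!-- hypothetical profile for turning integration tests off: mvn install -PskipIntegration -->
<profile>
  <id>skipIntegration</id>
  <properties>
    <skipITs>true</skipITs>
  </properties>
</profile>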

In many cases integration tests are part of the module; in some cases there are also dedicated modules which only contain integration tests.

One of the teams also uses Fitnesse for acceptance testing. These tests are also in dedicated modules.

We're using Hudson for CI.

lexicore
Thanks for mentioning Fitnesse - I've heard of it before, and now I've just watched the presentation - it's from Robert Martin. Looks good. I will try to think about how we can integrate it for our purposes! Do you have a lot of acceptance tests in Fitnesse?
Vladimir
My team - no, but the other team really has a lot of things done as acceptance tests.
lexicore
+1  A: 

In our project we have a separate suite for regular/plain unit tests and a separate suite for integration tests. There are two reasons for that:

  1. performance: integration tests are much slower,
  2. test fragility: integration tests fail more often due to environment-related conditions (they give false positives).

We use TeamCity as our main Continuous Integration server and Maven as the build system. We use the following algorithm to run the tests:

  1. We run unit tests within the Eclipse IDE and before every commit.
  2. We run unit tests automatically after each commit on TeamCity agents, using Maven's mvn clean install.
  3. We run integration tests automatically on a TeamCity agent after the "main" build is completed.

The way we trigger integration test execution is by configuring TeamCity's integration.tests task to depend on the "main" continous.build task; see here for details: http://confluence.jetbrains.net/display/TCD4/Dependencies+Triggers

We run only integration tests (excluding unit tests) by:

  • using a separate source directory named "src/it/java" to keep the integration tests,
  • excluding this source folder by default in the maven-surefire-plugin configuration (configuration/excludes element),
  • using a Maven profile called "integration" to exclude the regular unit tests and include the tests from "src/it/java" (this profile is activated by passing -Pintegration in the integration.tests task); a rough sketch of this setup is shown below.
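
For illustration only, the relevant pieces of such a pom.xml might look roughly like the sketch below. Surefire filters by class-name pattern rather than by source folder, so a naming convention (*IntegrationTest) is assumed here, and registering src/it/java as an extra test-source folder via build-helper-maven-plugin is also an assumption rather than something stated above:

<build>
  <plugins>
    <!-- assumption: make src/it/java visible as a test-source folder -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <executions>
        <execution>
          <id>add-integration-test-sources</id>
          <phase>generate-test-sources</phase>
          <goals><goal>add-test-source</goal></goals>
          <configuration>
            <sources><source>src/it/java</source></sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <!-- default build: surefire skips the integration tests -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <excludes>
          <exclude>**/*IntegrationTest.java</exclude>
        </excludes>
      </configuration>
    </plugin>
  </plugins>
</build>

<profiles>
  <!-- mvn test -Pintegration : run only the integration tests -->
  <profile>
    <id>integration</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <!-- combine.self="override" discards the default excludes above -->
          <configuration combine.self="override">
            <includes>
              <include>**/*IntegrationTest.java</include>
            </includes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>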
koppernickus
This is the closest pick for me, except that we would replace some integration tests with unit tests using mock-ups.
Vladimir