
Over the past year or so I have been developing my TDD chops, so that I am now fairly good at the essentials: writing tests first, mocking frameworks, testing the smallest possible units, DI, etc.

However I feel like there are still a lot of things that I am not getting out of unit testing.

For example, I often find that unit testing in this way does not really test the integration and overall bigger picture of what my code is supposed to be doing. With everything mocked out, I find that I lose sight of whether the methods under test are producing the results I actually need, rather than just the results they say they will provide. As I start to move towards BDD, I find that this problem is only exacerbated, resulting in wasted development time and ineffective tests.

Another problem is that unit tests require a large amount of maintenance to keep them orderly, slowing down refactoring.

When I first started unit testing, like most people I found that what I was really writing were integration tests. But those tests had many benefits: they were much easier to read, and they acted as decent documentation of my program's API. They also tended to catch real-world problems much faster, unlike unit tests, which I find spend too much time targeting edge cases that would only arise through incorrect use of the API (e.g. null references, division by zero, etc.).
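
To make the contrast concrete, here is a rough sketch of the kind of thing I mean (NUnit syntax, with Moq standing in as the mocking framework; all of the domain types are invented for illustration). The first test stubs out the only collaborator and then asserts exactly what the stub was told to return; the second wires up real objects and reads more like documentation of the API:

    using Moq;
    using NUnit.Framework;

    public interface ITaxCalculator { decimal AddTax(decimal net); }

    public class UkTaxCalculator : ITaxCalculator
    {
        // Invented rule for the example: 20% VAT.
        public decimal AddTax(decimal net) => net * 1.20m;
    }

    public class OrderProcessor
    {
        private readonly ITaxCalculator _tax;
        public OrderProcessor(ITaxCalculator tax) => _tax = tax;
        public decimal Total(decimal netAmount) => _tax.AddTax(netAmount);
    }

    [TestFixture]
    public class OrderProcessorTests
    {
        // Fully mocked unit test: the assertion mostly restates what the mock
        // was told to return, so it can pass even if the real tax rules are wrong.
        [Test]
        public void Total_WithMockedCollaborator()
        {
            var taxCalc = new Mock<ITaxCalculator>();
            taxCalc.Setup(t => t.AddTax(100m)).Returns(120m);

            var processor = new OrderProcessor(taxCalc.Object);

            Assert.That(processor.Total(100m), Is.EqualTo(120m));
        }

        // Integration-style test: real collaborators, asserts the outcome the
        // caller actually cares about, and reads like documentation of the API.
        [Test]
        public void Total_Includes20PercentTax()
        {
            var processor = new OrderProcessor(new UkTaxCalculator());

            Assert.That(processor.Total(100m), Is.EqualTo(120m));
        }
    }

The first test still has some value as a design aid, but on its own it tells me very little about whether the code produces the results I actually need.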

What are your thoughts? Can you recommend good books, articles or practices that tackle more advanced unit testing and help maintain productivity and effectiveness?

EDIT: Just a little follow-up question, given the answers: so basically you're saying that despite doing all this unit 'testing', I'm not really testing the code... to which I reply, 'But I want to test the dang code!' In fact, when I wrote lots of 'heavier' integration tests, I found that my code tended to reach a state of correctness much more quickly, and bugs were identified much earlier. Is it possible to achieve this without the maintainability problems of integration tests?

+6  A: 

TDD and BDD aren't meant to be tools for measuring code quality; they are meant to be tools that aid in designing loosely coupled, highly maintainable pieces of code. They have more to do with API design than anything else. They're meant to ensure that the code does what it says it does, and does it in a way where changing one part of the code does not affect other parts.

I would expect that your feeling of exasperation with BDD arises from the expectation that it is a tool to simply "eliminate bugs" or "replace your QA process", neither of which BDD or TDD is meant to do. Test Driven Development means "development, driven by tests", not "tests, driven by development". It appears to me that you want the latter.

Integration testing and software quality assurance are entirely different topics, but I understand why they are so often confused with TDD and associated with it.


Update: I just want to share my blog entry regarding this issue: Repeat after me: Test Driven Development is about design, NOT testing!

Jon Limjap
Unit tests are meant to provide a regression test suite, to let you know when refactoring breaks your application.
Robert Harvey
Unit tests are meant to provide them, yes. TDD isn't.
Jon Limjap
Do you write unit tests separately from TDD?
Robert Harvey
I write an integration test suite separate from the unit tests used for TDD. Our QA have their own set of tools as well.
Jon Limjap
Just so I'm clear, it sounds like your TDD is a SUPERSET of your Unit Tests?
Robert Harvey
On the contrary, it's a subset. Do TDD first: write tests, complete the code, red/green/refactor. And *then* write integration tests for the DB, persistence, etc. afterward, in another test suite.
Jon Limjap
So you consider TDD and Integration Tests to collectively be Unit Tests?
Robert Harvey
Alright, let me ask a different question then. Do you consider TDD to be misnamed? Shouldn't it be Design-Driven, Loosely-Coupled, Highly Maintainable Development (DDLCHMD) or something like that?
Robert Harvey
Yep, it's totally misnamed, that's why "BDD" became popular. Removing "tests" from the name was important, to emphasize that it's not about tests.
Jon Limjap
@Robert, integration tests are not unit tests. Unit tests are about isolated objects; integration tests are about interactions. The collective term I use to refer to both is "automated tests".
Joe White
@Jon, thanks for your reply and Blog post. I now have a much better opinion of TDD, which before I'd only seen in its misrepresented form.
Jacob
+2  A: 

Unit testing is just one type of testing; it is not the only type.

Unit testing is supposed to cover the smallest possible unit of work, and mocking out all dependencies is crucial to achieving this goal, given that those dependencies have their own unit tests covering them.

After you cover a decent number of your small units, you write what are called functional tests. These look like unit tests, but they don't mock everything you would mock in a unit test; generally, if your system is built by different teams, functional tests mock only the dependencies introduced by other teams, while your own team's code is not mocked.

After the functional tests come the integration tests; here you start using the real dependencies from other teams, and in general you shouldn't have any mocking in this kind of test.

Given that all three types of test are built using MSTest or NUnit, they are all still tests written in code.
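
For example, the first two levels might look roughly like this (NUnit shown here, with Moq as an example mocking library; all of the types and team boundaries are invented for illustration):

    using Moq;
    using NUnit.Framework;

    public interface IDiscountEngine { decimal Apply(decimal price); }    // owned by your team
    public interface ICurrencyService { decimal ToLocal(decimal usd); }   // owned by another team

    public class DiscountEngine : IDiscountEngine
    {
        // Your team's real code: a flat 10% discount, purely for the example.
        public decimal Apply(decimal price) => price * 0.9m;
    }

    public class PriceQuoter
    {
        private readonly IDiscountEngine _discounts;
        private readonly ICurrencyService _currency;

        public PriceQuoter(IDiscountEngine discounts, ICurrencyService currency)
        {
            _discounts = discounts;
            _currency = currency;
        }

        public decimal Quote(decimal usdPrice) => _currency.ToLocal(_discounts.Apply(usdPrice));
    }

    [TestFixture]
    public class PriceQuoterTests
    {
        // Unit test: every dependency is mocked, so only PriceQuoter's own few lines run.
        [Test]
        public void UnitLevel_AllDependenciesMocked()
        {
            var discounts = new Mock<IDiscountEngine>();
            var currency = new Mock<ICurrencyService>();
            discounts.Setup(d => d.Apply(100m)).Returns(90m);
            currency.Setup(c => c.ToLocal(90m)).Returns(72m);

            var quoter = new PriceQuoter(discounts.Object, currency.Object);

            Assert.That(quoter.Quote(100m), Is.EqualTo(72m));
        }

        // Functional test: your team's code (DiscountEngine) is real; only the
        // other team's dependency (ICurrencyService) is still mocked.
        [Test]
        public void FunctionalLevel_OnlyOtherTeamsDependencyMocked()
        {
            var currency = new Mock<ICurrencyService>();
            currency.Setup(c => c.ToLocal(90m)).Returns(72m);

            var quoter = new PriceQuoter(new DiscountEngine(), currency.Object);

            Assert.That(quoter.Quote(100m), Is.EqualTo(72m));
        }
    }

At the integration level you would construct PriceQuoter with the other team's real ICurrencyService implementation and mock nothing at all.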

bashmohandes
+3  A: 

I'm on the same road as you, it seems. For me, the book that has become my unit testing bible is xUnit Test Patterns: Refactoring Test Code by Gerard Meszaros.

mezoid
+1  A: 

If you are truly following TDD as described by its practitioners, each test should exercise a relatively small part of your code (just a few lines). By definition, these are not integration tests. The TDD folks will tell you that you need a whole separate suite of tests for integration testing.

You are correct that such a suite of TDD tests can eventually bog you down in minutiae. Test-driven development, by definition, is the creation, design, and testing of trees, not forests. TDD is the creation of requirements via unit tests, but by its very nature it embodies those requirements at the microscopic level.

If, as the TDD folks claim, you need separate integration tests, then you also need requirements, specifications, and procedures to go with those tests. As you move up the food chain, your tests become increasingly complex, until you arrive at the functional/user-interface level, where automated testing becomes almost impossible.

Robert Harvey
Rob Conery had an interesting conversation with Scott Hanselman on one of his podcasts while discussing his ORM tool, SubSonic. SubSonic has to be able to interact with databases from many different vendors. Some SubSonic users complained that Rob's unit tests were hitting the database too much (meaning they felt the database should be mocked out). Rob's retort was that mocking the database essentially invalidated the tests, since the whole point was to make sure that each vendor's database still worked properly with the tool.
Robert Harvey
A: 

Check out "Object Oriented Software Construction" by Bertrand Meyer.

The concept is called "Contract Driven Development" (better known as Design by Contract). It's an in-line form of testing at the function level, and it has changed the way I program.

If you use it in Eiffel, the language Meyer also designed, the contracts are automatically checked by the runtime during the test and debug phases.
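
Eiffel builds its require/ensure/invariant clauses into the language itself, so the contracts live right next to the code they describe; in other languages you can only approximate the style. Here is a rough C# sketch of the same kind of in-line, function-level checking (the Account class and its rules are invented purely for illustration):

    using System;
    using System.Diagnostics;

    public class Account
    {
        public decimal Balance { get; private set; }

        // Class invariant: the balance never goes negative.
        [Conditional("DEBUG")]
        private void CheckInvariant() =>
            Debug.Assert(Balance >= 0, "Invariant violated: negative balance");

        public void Withdraw(decimal amount)
        {
            // Preconditions ("require" in Eiffel): the caller must pass a valid amount.
            if (amount <= 0) throw new ArgumentOutOfRangeException(nameof(amount));
            if (amount > Balance) throw new InvalidOperationException("Insufficient funds");

            var oldBalance = Balance;
            Balance -= amount;

            // Postcondition ("ensure" in Eiffel): the balance dropped by exactly `amount`.
            Debug.Assert(Balance == oldBalance - amount, "Postcondition violated");
            CheckInvariant();
        }

        public void Deposit(decimal amount)
        {
            if (amount <= 0) throw new ArgumentOutOfRangeException(nameof(amount)); // precondition
            Balance += amount;
            CheckInvariant(); // invariant is re-checked after every public operation
        }
    }

In Eiffel the equivalent contracts are declared in the routine itself and checked automatically by the runtime during testing and debugging, rather than written out as explicit guard clauses and asserts.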

http://www.amazon.com/Object-Oriented-Software-Construction-Prentice-Hall-International/dp/0136291554

http://dev.eiffel.com