I recently finished a project using TDD and I found the process to be a bit of a nightmare. I enjoyed writing tests first and watching my code grow, but as soon as the requirements started changing and I started refactoring, I found that I spent more time rewriting and fixing unit tests than I did writing code, much more time in fact.

I felt, while I was going through this process, that it would have been much easier to write the tests after the application was finished, but if I had done that I would have lost all the benefits of TDD.

So are there any hints / tips for writing maintainable TDD code? I'm currently reading Roy Osherove's The Art of Unit Testing; are there any other resources that could help me out?

Thanks

+1  A: 

Yes, there is a whole book called xUnit Test Patterns that deals with this issue.

It's in the Martin Fowler Signature Series, so it has all the trappings of a classic patterns book. Whether you like that or not is a matter of personal taste, but I, for one, found it immensely valuable.

Anyhow, the gist of the matter is that you should treat your test code as you would your production code. First and foremost, you should adhere to the DRY principle, because that makes it easier to refactor your API.
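
For instance, something as small as building the system under test through a single helper method keeps a constructor change from rippling through every test. A rough sketch only (NUnit assumed; OrderService, FakeClock and the other names are invented for illustration):

    using NUnit.Framework;

    public class FakeClock { public System.DateTime Now = System.DateTime.Now; }

    public class OrderService
    {
        private readonly FakeClock _clock;
        public OrderService(FakeClock clock) { _clock = clock; }
        public int OpenOrderCount() { return 0; }
    }

    [TestFixture]
    public class OrderServiceTests
    {
        // One place to construct the SUT: when the constructor changes,
        // only this helper has to change, not every individual test.
        private static OrderService CreateService()
        {
            return new OrderService(new FakeClock());
        }

        [Test]
        public void A_New_Service_Has_No_Open_Orders()
        {
            var service = CreateService();
            Assert.AreEqual(0, service.OpenOrderCount());
        }
    }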

Mark Seemann
+10  A: 

Practice

It takes a while to learn how to write decent unit tests. A difficult first project (or several of them) is nothing strange.

The xUnit Test Patterns book recommended already is good, and I've heard good things about the book you're currently reading.

As for general advice, it depends on what was hard about your tests. If they broke often, they may not be unit tests but rather integration tests. If they were difficult to set up, the SUT (System Under Test) could be showing signs of being too complex and would need further modularisation. The list goes on.

Some advice I live by is following the AAA rule.

Arrange, Act and Assert. Each test should follow this formula. It makes tests readable, and easy to maintain if and when they do break.
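
A minimal sketch of what that layout looks like in practice (NUnit assumed; Calculator is an invented stand-in for whatever class you are testing):

    using NUnit.Framework;

    public class Calculator
    {
        public int Add(int a, int b) { return a + b; }
    }

    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void Add_TwoNumbers_ReturnsTheirSum()
        {
            // Arrange: set up the object under test and its inputs.
            var calculator = new Calculator();

            // Act: perform the single action being tested.
            int result = calculator.Add(2, 3);

            // Assert: verify the observable outcome.
            Assert.AreEqual(5, result);
        }
    }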

Design is Still Important

I practice TDD, but before any code is written I grab a whiteboard and scribble away. While TDD allows your code to evolve, some up-front design is always a benefit. Then you at least have a starting point, and from there your code can be driven by the tests you write.

If I'm carrying out a particularly difficult task, I make a prototype. Forget TDD, forget best practices; just bash out some code. Obviously this is not production code, but it provides a starting point. From this prototype I then think about the actual system and what tests I require.

Check out the Google Testing Blog - it was the turning point for me when starting TDD. Misko's articles (and site - the Guide to Testable Code especially) are excellent, and should point you in the right direction.

Finglas
+1. Love the reference to Misko's articles. I consider them to be a turning point for me too.
Lieven
+1  A: 

Are you making liberal use of interfaces, Dependency Injection and mocking?

I find that designing to interfaces, and then injecting implementations of those interfaces using a DI framework such as Ninject, makes it a lot easier to mock out parts of the application so that you're properly testing components in isolation.

This then makes it easier to make changes in one area without affecting others too much; or, if the changes do need to propagate, you can just update the interfaces and work through each distinct area one at a time.
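
Something along these lines, as a sketch only (Ninject's standard kernel API is shown, but INotificationSender, EmailSender and OrderProcessor are invented names):

    using Ninject;

    // Depend on an abstraction rather than a concrete class.
    public interface INotificationSender
    {
        void Send(string recipient, string message);
    }

    public class EmailSender : INotificationSender
    {
        public void Send(string recipient, string message) { /* SMTP call here */ }
    }

    public class OrderProcessor
    {
        private readonly INotificationSender _sender;

        // The collaborator is injected, so a test can supply a fake or mock
        // and exercise OrderProcessor in complete isolation.
        public OrderProcessor(INotificationSender sender)
        {
            _sender = sender;
        }

        public void Complete(string customerEmail)
        {
            _sender.Send(customerEmail, "Your order is complete.");
        }
    }

    public static class Program
    {
        public static void Main()
        {
            // Production wiring lives in one composition root;
            // tests bypass the container and inject a mock directly.
            var kernel = new StandardKernel();
            kernel.Bind<INotificationSender>().To<EmailSender>();
            var processor = kernel.Get<OrderProcessor>();
            processor.Complete("customer@example.com");
        }
    }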

RSlaughter
+5  A: 

"as soon as the requirements started changing and I started doing refactorings I found that I spent more time rewriting / fixing unit tests"

So? How is this a problem?

Your requirements changed. That means your design had to change. That means your tests had to change.

"I spent more time rewriting / fixing unit tests than I did writing code, much more time in fact."

That means you're doing it right. The impact of the requirements and design changes fell mostly on your tests, and your application itself didn't require much change.

That's the way it's supposed to work.

Go home happy. You've done the job properly.

S.Lott
Thinking like this is exactly the reason that I don't embrace TDD. The job of a programmer is not to write tests, it's to produce working, good code. If it takes longer to perform the verification that the code is good than to create the actual code, I think something is seriously flawed in the process.
erikkallen
@erikkallen: The job of a programmer is "to produce working, good code" *in which you have complete confidence*. I can't emphasize the *complete confidence* enough. Testing is a good way to raise the confidence in the code. It *should* take a long time to verify that the code is good. TDD doesn't change that. Nothing changes that. Confidence requires a lot of care -- either a lot of testing or a carefully-built proof. Or both.
S.Lott
+2  A: 

It sounds like your unit tests are fragile and overlapping. Ideally, a single code change should affect just one unit test: with a one-to-one match of tests to features, the other tests don't depend on the feature that changed. That may be a little too idealistic (in practice many of our tests do re-exercise the same code), but it's something to keep in mind. When one code change affects many tests, it's a smell.

Also, with respect to your specific example of renaming: find a tool that will automate these refactorings for you. I believe ReSharper and CodeRush both support such automated refactorings; it's a much quicker, easier, and more reliable way to refactor than the manual approach.

To better learn your IDE, nothing beats pairing with someone else. You'll both learn; you'll both develop new skills - and it doesn't take long. A few hours will dramatically increase your comfort with the tool.

Carl Manaster
+1 for recommendation on pairing for learning the IDE. I like to take half a day every couple of weeks or so, and use the time solely to learn something new about my IDE, or some other tool. It pays great dividends in the long run.
Caffeine Coma
+1  A: 

I think you might want to strike a decent balance between testing and coding.

When I start a project, since the requirements and aims change all the time, I hardly write any tests at all because, as you observed, it would take too much time to constantly fix the tests. Sometimes I just write a comment saying "this should be tested" so that I won't forget to test it.

At some point you feel that your project is getting into shape. That's the moment when heavy unit testing comes in handy, and I write as many tests as possible.

When I start doing a lot of refactoring, I do not bother much about the tests until the project has settled down again. I also leave some "test this" comments. When the refactoring is over, it's time to rewrite all the failing tests (and perhaps ditch some of them, and certainly write some new ones).

Writing tests this way is really a pleasure, because it acknowledges that your project has reached a milestone.

Olivier
+2  A: 

I'm a huge fan of unit testing but have experienced problems with TDD (or basic unit testing for that matter) on my most recent project. After conducting a post-implementation review I found that we (the rest of the team and I) faced two main problems with our implementation and understanding of TDD and unit testing.

The first problem we faced was that we didn't always treat our tests as first-class citizens. I know this sounds like we were going against the philosophy of TDD, but our problems came after we'd done most of the initial design and were hurried into making on-the-fly changes. Unfortunately, due to time constraints, the later part of the project became rushed and we fell into the trap of writing our tests after the code had been written. As the pressure mounted, working code was checked into source control without checking whether the unit tests still passed. Admittedly this problem has nothing to do with TDD or unit testing; it was rather the result of tight deadlines, average team communication and poor leadership (I'm going to blame myself here).

When we looked a little deeper into the failing unit tests, we discovered that we were testing too much, especially considering our time constraints. Instead of focusing our testing on the code with the highest return, we were using TDD to write tests for the entire code base. This made our proportion of unit tests to code much higher than we could maintain. We (eventually) decided to only use TDD and write tests for business functionality that was likely to change. This reduced our need to maintain a large number of tests which, for the most part, rarely (or never) changed. Instead our efforts were better focused, and made for a more comprehensive suite of tests on the parts of the application we really cared about.

Hopefully you can learn from my experiences and continue to develop with TDD, or at the least still develop unit tests for your code. Personally I found the following links extremely useful in helping me understand concepts such as selective unit testing.

Kane
A: 

First of all, refactoring does not break unit tests. Did you apply refactoring by the book? It could be that your tests are testing implementation and not behavior, which may explain why they break.

Unit testing should be black-box testing: test what the unit does, not how it does it.
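
To illustrate the distinction, here is a sketch (NUnit assumed; Pricer and the discount rule are invented): the test below pins down observable behaviour and survives refactoring, whereas a test written against the internals would not.

    using NUnit.Framework;

    // Invented example: the observable rule is "VIP customers get 10% off".
    public class Pricer
    {
        public decimal PriceFor(bool isVip, decimal listPrice)
        {
            return isVip ? listPrice * 0.9m : listPrice;
        }
    }

    [TestFixture]
    public class PricerTests
    {
        // Black box: asserts on the result the caller sees, so it keeps
        // passing no matter how PriceFor is restructured internally.
        [Test]
        public void Vip_Customers_Get_Ten_Percent_Off()
        {
            Assert.AreEqual(90m, new Pricer().PriceFor(true, 100m));
        }

        // A white-box test that asserted on private fields, call order or
        // intermediate values would break on every refactoring, even when
        // the behaviour above is unchanged; that is the fragility to avoid.
    }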

philippe
A: 

Are you using a good IDE? I was asking myself the same question as you a few years ago when I first embraced Unit Testing. Back then, I was using a combination of Emacs, find and grep to do refactorings. It was painful.

Thankfully, a colleague bonked me on the head and convinced me to try using "modern tools", which in his vernacular meant IntelliJ IDEA. IDEA is my personal preference, but NetBeans or Eclipse will handle the basics just as well. It's hard to overstate the productivity gain this afforded me; easily an order of magnitude, especially for large projects with lots of tests.

Once you have an IDE squared away, if you are still running into issues it's time to consider the DRY principle, which seeks to ensure that each piece of information is kept in only a single place (a constant, a property file, etc.) so that if you need to change it later, the ripple effects are minimized.
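
A trivial sketch of that idea (NUnit assumed; Limits and LockoutPolicy are invented names): the value is defined exactly once, and both the production code and the test reference it, so a later change touches one line.

    using NUnit.Framework;

    public static class Limits
    {
        // Single authoritative definition; change it here and nowhere else.
        public const int MaxLoginAttempts = 3;
    }

    public class LockoutPolicy
    {
        public bool ShouldLockOut(int failedAttempts)
        {
            return failedAttempts >= Limits.MaxLoginAttempts;
        }
    }

    [TestFixture]
    public class LockoutPolicyTests
    {
        [Test]
        public void Locks_Out_At_The_Configured_Limit()
        {
            var policy = new LockoutPolicy();
            Assert.IsTrue(policy.ShouldLockOut(Limits.MaxLoginAttempts));
        }
    }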

Caffeine Coma
A: 

If your tests are hard to maintain, it's a sign that your code is fragile. It means that the definition of the class is changing frequently.

Consider defining a class's responsibilities as the calls it makes to an injected interface. Think of it more in terms of passing data, or sending a message, rather than manipulating state.
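
In test terms that usually means interaction-based tests: rather than inspecting the object's state afterwards, you verify that it sent the expected message to its injected collaborator. A sketch (NUnit and Moq assumed; IAuditLog and AccountService are invented names):

    using Moq;
    using NUnit.Framework;

    public interface IAuditLog
    {
        void Record(string entry);
    }

    public class AccountService
    {
        private readonly IAuditLog _log;
        public AccountService(IAuditLog log) { _log = log; }

        public void Close(int accountId)
        {
            // The responsibility under test is the message sent to the log,
            // not any internal state mutation.
            _log.Record("Closed account " + accountId);
        }
    }

    [TestFixture]
    public class AccountServiceTests
    {
        [Test]
        public void Closing_An_Account_Writes_An_Audit_Entry()
        {
            var log = new Mock<IAuditLog>();

            new AccountService(log.Object).Close(42);

            // The assertion is about the outgoing call (the "message"),
            // not about internal state.
            log.Verify(l => l.Record(It.IsAny<string>()), Times.Once());
        }
    }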

kyoryu