Given a short sprint, is it ever acceptable to forgo TDD in order to "get things done" within the sprint?

For example, a given piece of work might need, say, a third of the sprint to design the object model around an existing implementation. Under this scenario you might well end up with implemented code halfway through the sprint, without any tests (implementing unit tests during this "design" stage would add significant effort, and the tests would likely be thrown away a few times until the final "design" is settled upon).

You might then spend a day or two in the second week adding unit and integration tests after the fact.

Is this acceptable?

+7  A: 

If you can accurately code something without tests, why use unit tests at all? Unit tests should either help you write code faster or help you write better code. If they do neither, don't use them.

Andrew
Unit tests aren't necessarily about helping you write code; they can be about many, many other things, including preserving functionality and providing "living" documentation of functionality.
McWafflestix
@McWafflestix - unit tests cost money to write, which means you need to cost-justify them. Management won't be happy if it took you 25% longer to finish a project because of "preserving functionality" and providing "'living' documentation" of your code.
Andrew
Management at YOUR company might not be happy; management at MY company happens to expect about 50% more time (or more; at least 33% of the time on a piece of code) to be spent on quality assurance of code, of which unit tests are a large part.
McWafflestix
Which I covered: "unit tests should either help you write code faster or help write better code". I would argue that preserving functionality and living documentation provide neither, though ... they are wasted effort to management, and simply a nice-to-have for a developer.
Andrew
You missed this reason: **Writing the test gives you coverage for regression testing.** Without decent coverage of all functionality you can't refactor mercilessly, because you have to keep worrying about breaking things you haven't tested. You might not want to write tests in places where you feel confident, but I bet you want other people to write tests on stuff you have to maintain ;-)
cartoonfox
I agree, but integration tests are typically better at regression-testing behavior, and allow you to make larger changes with confidence. Unit tests make it difficult to make large changes to your app, and productivity often suffers without quality improving. A full suite of integration tests, OTOH, can still provide full code coverage, but gives you more freedom.
Andrew
I have found that TDD works well for _designing_ the code in the first place, works well for documenting to new users _how_ to use the code, and _catches_ cases where maintainers accidentally break the code during maintenance. Code changes can often reach far, much further than you think.
Thorbjørn Ravn Andersen
+2  A: 

To my mind this can be a dangerous trade-off to make. While it may be acceptable some of the time, it sets a precedent that can be hard to overturn. Where I work, for some complicated things, we tend to have to implement something a few times before finally arriving at a good implementation, because the idea of something better comes along every so often in our project.

Usually it is in building the tests that missing requirements are caught and a more complete design can be drawn up before a lot of code exists. Otherwise you end up with a pile of code that you are tempted to keep reusing rather than throw away and rebuild properly from scratch.

The key is to be able to follow through on those tests in the second week rather than shift focus to something new that seems to be a priority-one work item. While this can work out fine for some teams, for others it can be a recipe for disaster, IME.

JB King
+10  A: 

I would say that it's almost always acceptable to bypass a process if it means you complete a project that you wouldn't otherwise be able to complete. Processes should be there to help you; if your process is not helping, then don't use it (after discussing it with the rest of the team first, of course).

However, bypassing TDD can easily have the exact opposite effect: you could write buggy code that needs a rewrite just before you ship, when final testing turns up critical problems you should have spotted sooner. So think carefully before doing it.

If you do skip unit testing to get something out the door, and are lucky enough that it works, you should treat it as technical debt to be paid back as soon as possible.

Mark Byers
On the basis of your answer, it sounds like the sprint should be stopped at the end of week one, when the task looks too big to complete in a TDD manner, and a new sprint backlog created. In an ideal world, of course; in reality it would certainly take an enlightened project team to accept this.
Ben Aston
Normally you would first try less severe alternatives, for example removing some functionality but still delivering the remaining functionality with a high quality. With a two-week sprint, removing something might not leave much, but if it's a possibility I'd try that option first.
Mark Byers
"I would say that it's almost always acceptable to bypass any process if it means that you complete a project" - would the project be "complete" if the code sort of worked but every programmer was too scared to touch it because it's untested and shoddy?
cartoonfox
You missed a bit: ... "that you wouldn't otherwise be able to complete". I.e. if it's a difference between getting paid for it, or not getting paid because you broke the contract, then it is *sometimes* acceptable to choose getting paid even if it means taking a shortcut. Of course the management should very carefully weigh up the risks involved but there is no answer that always is right. Sometimes you need to get things out the door **now**, even if it means more work next week.
Mark Byers
@cartoonfox: "Fast, good and cheap - pick any two." It's the client's choice, and it's their funeral. It makes me sick to write low-quality code, but sometimes that's what is asked for. That said, it's usually because of short-sightedness, and what you get for cheap now will usually cost you an arm and a leg to get right later.
Mathias
+4  A: 

In my experience, writing proper unit tests takes at least as long as writing the code they test, so I have a hard time seeing how you're going to write the tests in "a day or two".

That said, what is "acceptable" depends on a lot of things. Don't treat testing as a religious issue. The tests exist to give you a certain level of confidence in your code; you just need to realize that by forgoing testing you are also increasing your risk. Sometimes that is appropriate, other times it isn't.

The two extremes to be careful of:

  • Saying you'll write the tests later, and then never getting around to it. This is "borrowing from the future", and you can quickly accumulate more debt than you can pay back.

  • Spending more time on testing than is warranted. Some risks are so small that there is no point testing for them: the expected cost of something going wrong is less than the cost of writing the test (see the back-of-the-envelope sketch after this list).
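
As a rough sketch of that expected-cost trade-off, here is the arithmetic spelled out; every number below is invented purely for illustration:

```python
# Hypothetical expected-cost check: is this particular test worth writing?
cost_to_write_test = 2.0    # hours to write and maintain the test
probability_of_bug = 0.01   # chance this code path is ever wrong
cost_if_bug_ships = 8.0     # hours to diagnose and fix it in production

# Expected cost of skipping the test = probability * cost of failure.
expected_cost_of_skipping = probability_of_bug * cost_if_bug_ships  # 0.08

if expected_cost_of_skipping < cost_to_write_test:
    print("skipping this test is defensible")  # taken with these numbers
else:
    print("write the test")
```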

Laurence Gonsalves
If I am to spend 50% of my time on unit and integration testing, then I will have to substantially reduce the team velocity as perceived by the project team. How best to manage this?
Ben Aston
@Ben Aston: make the point clear to the project team that quality, reliable software takes more time to write than shoddy, unreliable software. The idea is relatively simple...
McWafflestix
@Ben - This is why most companies don't test. Studies have shown that unit testing catches about 10-20% more bugs than "traditional" manual testing by a developer. That sounds good ... if testing only takes 10-20% extra time, which is VERY rare. Integration tests, OTOH, have far more benefit, both to the quality of the product and to longer-term testing needs (i.e. regression testing).
Andrew
@McWafflestix - if unit tests are what separates your company's software from shoddy software, you might want to find a new set of developers.
Andrew
@Andrew: if trust in your developers is the only quality assurance your company has in place, you might want to find a new company. Good intentions only go so far; bugs happen, even to the best developers. And to address your other point; the percentage of bugs to the perceived quality is not a linear relationship; at some point, having 10% more bugs in your software may lose you more than 10% of your business. Good management knows how to make those decisions. (Bad management only thinks they do.)
McWafflestix
@McWafflestix - but unit testing doesn't help you in the situation you lay out, whether you trust your developers or not. I am not railing against unit testing; I am simply stating that it should not be done for the wrong reasons ... and certainly shouldn't be done if it extends the delivery schedule appreciably.
Andrew
@Andrew: all I'm saying is that unit testing DOES lengthen the delivery schedule, sometimes appreciably, and that it provides value through continuing, automated quality assurance. Good management knows that quality assurance, and planning for it, is valuable. Whether to accept later delivery in exchange for increased ongoing quality is fundamentally a scheduling/staffing decision - that is, a decision that needs to be made by management, even when it extends the delivery schedule.
McWafflestix
+1  A: 

From your description you aren't really doing TDD, since the tests are not driving the design. It is more the case that you are doing unit testing, which is still a good thing.

As for the basic question - can you forgo tests? - this is really something that only the team's experience can guide you on, in terms of the trade-off. Generally you come to regret it, but sometimes a quick, bad hack is better than not delivering at all.

Yishai
+1  A: 

Maybe I'm missing something, but you're either doing TDD according to the generally agreed principles, or you're running a TDD cargo cult.

If your team/company has made the decision to use TDD, and intends to stick with it, you either need longer sprints, or you need to reduce the amount of functionality to be completed within the 2 weeks.

Ash
+8  A: 

A two-week iteration isn't short for a lot of people; many of us are doing one-week iterations. Kent Beck is even trying to encourage daily deployments - and there are advantages to cleaning up the dev process so that it can be that responsive.

NEVER reduce TDD quality to get stuff out - it's so much harder to clean up later, and you just end up teaching the customer that they can pressure you into quick, dirty, hacked releases. They don't see the crap code that gets produced as a result - and they don't have to maintain it. If somebody tried to get me to do that I'd quit... and I have refused to work in places that "don't have time to test properly". That excuse doesn't work.

NOTE: when I write about TDD, I'm including functional tests. These are important because they should exercise scenarios that make sense to the customer in terms of recognizable user stories. I normally start work on a story with the functional test, because it's the most important one - "test that the customer gets what they described..." All the other tests might be up for negotiation, but when I'm leading a team I expect at least one functional test per story, or it's (as Scrum people say) "not done!" ;-)
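
As a rough illustration of what such a story-level test can look like, here is a pytest-style sketch; `create_app` and `register_user` are hypothetical stand-ins for whatever your application actually provides:

```python
# Illustrative functional test for the story:
# "A registered user can log in and see their dashboard."
# The application API (create_app, register_user, test_client) is hypothetical.
from myapp import create_app, register_user  # hypothetical helpers

def test_registered_user_can_log_in_and_see_dashboard():
    app = create_app(testing=True)
    register_user(app, email="ada@example.com", password="s3cret")

    client = app.test_client()
    response = client.post("/login", data={"email": "ada@example.com",
                                           "password": "s3cret"})

    # Assert the customer-visible outcome, in the customer's own terms.
    assert response.status_code == 200
    assert b"Dashboard" in response.data
```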

Don't think you can go in and add tests later - it's so much more difficult to do that. (I have tried it both ways - believe me.) It really is cheaper to put tests in as you go - even if you have to refactor and rewrite, or throw some away as the system evolves.

You can't get quality code without having decent code coverage ALL the time.

Test coverage is the important phrase here: covering the stuff that could break - critical tests for the things you actually need to worry about - not just zillions of meaningless tests.
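
To make "meaningless" versus "critical" concrete, a small sketch - the first test merely restates library behavior, while the second pins down a boundary in domain logic (`apply_discount` is a hypothetical function standing in for real code under test):

```python
# Meaningless: restates what the standard library already guarantees,
# so it can hardly ever catch a bug in *your* code.
def test_list_append_appends():
    items = []
    items.append(1)
    assert items == [1]

# Critical: pins down a business rule that could genuinely break under change.
# apply_discount is a hypothetical stand-in for real code under test.
def test_discount_never_pushes_total_below_zero():
    assert apply_discount(total=5.00, discount=10.00) == 0.00
```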

If you can't get it out in time and that's a problem, you need to think about why:

  • Is it a planning/release scheduling problem?
  • Is there a code maintenance/refactoring problem that holds things up?
  • Are there people putting unreasonable pressure on the team? (get the CTO to beat them up...)
cartoonfox
Suggested solutions, reference to Beck and personal experience, hence marked as answer. Other answers good too...
Ben Aston
@Ben - don't hold Kent Beck up too high ... one of the most famous failed TDD projects was his at Chrysler. TDD works if you have LOTS of time, but management usually doesn't give enough time to make TDD successful, so you end up with a set of shoddy tests. You might have decent code coverage but poor functional coverage, which gives your team a false sense of security.
Andrew
Chrysler people wanted to stop using Smalltalk... and lots of other reasons. Loads of influential people on that project - lots going on there.... highly political environment. **It's not fair and hopelessly oversimplified to blame the whole thing on Kent Beck introducing TDD.** Whether or not you end up with lots of shoddy tests is down to the team's discipline and experience. TDD has worked in 100% of places I've worked. **It's not easy to learn TDD well though - so if it doesn't work for you get some experienced help** and step back to think about what you could do better.
cartoonfox
I think "never reduce TDD quality to get stuff out" is a bit strong. See my answer.
WW
Okay - reduce TDD quality if you want to, but I won't, and I don't want to maintain your stuff if you're going to do that. ;-)
cartoonfox
I just want to note that improving TDD *quality* might actually mean ending up with fewer tests, i.e. getting rid of several duplicated, bad tests and putting in fewer, better-thought-through ones...
cartoonfox
+1  A: 

> implementing unit tests during this "design" stage would add significant effort and the tests would likely be thrown away a few times until the final "design" is settled upon

I've found that Test-Driven Design takes me to the final "design" faster: in fewer iterations, and often the first shot is the last. Design iterations are normally needed because the program was designed without any reality check, and the subsequent implementation effort and attempts at actual use drive the redesigns. With TDD, the reality check is in place all the time, and that can be enough to ensure you don't miss the mark and have to redesign the thing.
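
A toy sketch of that reality check: the test is written first and fails ("red"), and only then is the simplest code written to make it pass ("green"). The function name and behavior here are invented for illustration:

```python
# Step 1 ("red"): write the test first. Before slugify exists, running it
# fails with a NameError - proof that the test can fail, and a forced,
# early decision about the interface you actually want to call.
def test_slugify_lowercases_and_hyphenates():
    assert slugify("Hello World") == "hello-world"

# Step 2 ("green"): the simplest implementation that makes the test pass.
def slugify(title: str) -> str:
    return title.strip().lower().replace(" ", "-")
```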

just somebody
A: 

If you are not shipping code, then what good are you as a developer? The most important thing is to get code in front of your users. If that means sacrificing one part of the process to complete a project, then do it. The CEO is not going to give you a bonus for implementing TDD and not shipping.

SWD
A: 

Good tests double the time it takes to finish. If the management team dings you for taking longer, well then, this is where the metal hits the meat in real-life projects, isn't it?

I find TDD to be a zealot's practice: it wastes time on making tests fail, on the compilation errors, on refactoring tests and code to death, and the like. In the end, as long as you get good test coverage of the code, you can save time by creating the unit tests immediately after the feature compiles and starts to work.

McG
"Good tests double the time it takes to finish." This flies in the face of most [studies](http://www2.imm.dtu.dk/pubdb/views/edoc_download.php/5571/pdf/imm5571.pdf) I've seen on the matter, where good tests end up with better-maintainable code in about the same time or *less* time than code without good tests. What citation do you offer for your contrary assertion?
bignose
A: 

This is a business decision rather than a technical one. Does the code need to truly work, or just look like it works?

Here is one time it would be acceptable:

You have a new product that hardly anyone is using yet, and your head salesperson is going to an industry conference to demonstrate it. You trust her to dance around any bugs you leave in. The conference will build interest in your product, and sales will flow in over the following 3-6 months. Your company has 8 months of cash left in the bank.

Here is one time it would not be acceptable:

Your code controls X-ray hardware in 5,000 hospitals, and you are rolling out a new version. You don't want to kill people, and your company will be sued into oblivion if you make big mistakes.

There is a trade-off to be made between speed of development and quality. That is a business decision. Let's hope you've got a manager who is willing to understand that.

WW