views: 189 · answers: 12

I am preparing a presentation about unit testing to managers. My team has been writing unit tests for years now but other areas of the company seem to have a hard time adopting it. I can get developers excited about it, but if there is not a buy-in from management, it will not become a standard.

Do you guys have suggestions how to best approach management about this? What are some things you have tried in the past that worked? What are things that did not work?

+1  A: 

Tell them it makes the code more maintainable and will save them money in the long run. It also reduces bugs, if done correctly.

So in other words, it reduces costs and increases reliability of the software.

hvgotcodes
+1  A: 

To build on @hvgotcodes answer, don't just tell them how it makes things better, compile some stats from your own team of how it has

  1. Reduced turnaround time
  2. Reduced $ cost
  3. Reduced regressions
  4. etc

Especially the costs, businesses love to save money :)

BioBuckyBall
While I agree with the suggestion in principle, it could be a tough sell if you don't have good data from both before and after implementing unit testing.
GreenMatt
@GreenMatt - True enough. I once did the same thing as the OP, and when I was about halfway done I stopped for questions. Management said 'Sounds fantastic. Do you have any numbers?' I didn't, and not only did they kill the other teams' momentum, they made us scale back time spent on developer testing. Eventually we switched to full-on 'let QA find bugs' and I left :)
BioBuckyBall
+1  A: 

Unit testing, if done well, is supposed to

  • catch bugs earlier, thus reducing the amount of bugs found by QA or in production, and also reducing the cost of fixing bugs
  • support refactoring, thus keeping the design clean and extensible, which in the long run should make maintenance cheaper

Are there any relevant statistics collected within your company that could back up these claims concretely? E.g. the number of bugs found by QA / users in different products, or average bugfix / feature implementation costs, ...

Péter Török
+2  A: 

Don't mislead.

When you cite reduced costs, you're implying that the savings from reduced QA and easier refactoring outweigh the cost of creating and maintaining the tests. It would be good to point that out, and to put some metrics in place to bolster your point.

The problem is that you can't put a dollar value on the path not taken ("We avoided 1,000 defects that would have cost $1M to fix!").

But you should be up front about the fact that there's a cost to creating the tests, and that they must be maintained. If you create them and then let them fall into disrepair, you're worse off than when you started.

Your argument will get a boost when you go to add new features and it's easier because you've kept the code clean.

duffymo
+7  A: 

For non-tech managers I have fallen back on an analogy of building a house. Would they build one without blueprints, just taking bricks from a pile and putting one atop the other with a vague house-shaped idea in mind? If not then why won't they accept proper documentation before coding, design reviews, etc?

For unit testing, you could try comparing it to quality-testing the various house components individually (bricks, cement, doors, windows, plumbing) before putting them together.

For those with any tech grasp at all, I mention that 30% to 50% of the code handles error conditions and recovery (in mission-critical embedded systems), and that you just can't test that without unit testing. E.g. if you only do black-box integration testing, how can you test - in a controlled and repeatable manner - what happens when module A can't allocate memory at a given point, or a timeout expires deep in module C while waiting for a reply message from somewhere, or a database read that normally succeeds suddenly fails? Explain that by dummying out interfacing modules you can simulate their erroneous behaviour at will.
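That "dummying out" idea can be sketched in a few lines of Python with the standard `unittest.mock` module; `handle_request` and its `transport` dependency are hypothetical stand-ins for the modules described above:

```python
from unittest import mock

# Hypothetical code under test: in production, transport.fetch_reply()
# waits on a reply message from another module.
def handle_request(transport):
    try:
        reply = transport.fetch_reply()
    except TimeoutError:
        return "degraded"  # fall back instead of crashing
    return reply

# In a unit test we can force the timeout at will - something a
# black-box integration test cannot do in a repeatable way.
transport = mock.Mock()
transport.fetch_reply.side_effect = TimeoutError
assert handle_request(transport) == "degraded"

# And the normal path, for contrast.
transport.fetch_reply.side_effect = None
transport.fetch_reply.return_value = "ok"
assert handle_request(transport) == "ok"
print("both paths exercised")
```

The same trick covers the memory-allocation and database-failure cases: point `side_effect` at `MemoryError` or a DB error class and the error path runs on demand.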

And, of course, I draw my favourite graph with a $ symbol on one axis and a clock on the other. I beat management over the head with this at every opportunity to show that the cost of getting bugs out increases the later they are discovered (change a line in a requirement spec – 5 minutes; a few paras in a design doc – hours; several hundred lines of code (at code review or unit test stage) – days; finding the needle-in-a-haystack bug and correcting it at system test – weeks).

It's not just unit testing – you have to convince management – and your fellow developers – that the "unnecessary overhead" of documentation, reviews, testing ... s/w processes in general (and that includes learning new tools) ... actually saves time rather than adding time to a project. In other words, it takes longer and costs more to get it wrong. Measure twice, cut once, etc.

For unit testing, tell them about continuous integration (http://en.wikipedia.org/wiki/Continuous_integration). I strongly recommend Hudson, but use whatever works for you. Explain that a regular build, whether nightly or on every check-in, can automatically pull unit tests from your VCS and run them, sending email or otherwise alerting the team to new code that broke existing tests almost as soon as it happens.

If none of that works, look for a new job (if you are in Singapore, or want to be, talk to me ;-)

LeonixSolutions
+2  A: 

I think statistics would make the best case as far as non-technical managers are concerned. Code Complete, 2nd Ed. has a summary of a couple of studies showing that unit testing yields an average defect-detection rate of 30% (Table 20-2, p. 470). I think it should be easy to take it from there and make the case that fewer bugs translate to more money saved.

ShaderOp
+2  A: 

In some cases I have found that waiting for the right opportunity is key.

For example in one case I waited until there was a hot site with a customer. It was very painful (think $$$). One of the unavoidable questions asked by upper management was how to avoid the same issue in the future. And that's when I presented unit testing to the management...

So I guess it depends a lot on the circumstances.

John
+2  A: 

Show them an example of where testing already caught an issue and saved the day.

Andy
Similarly, I wrote a new feature a few years ago that was one of those areas with lots of very fiddly corner cases, such that fixing one would likely break some of the others. I used a unit testing/TDD method for it and in 5 years not a single bug was found in it because they were all caught in development. Compare that with other features that weren't unit tested and had to be fixed up time after time.
the_mandrill
I can find *many* cases where we almost lost multi-million-dollar customers over simple corner cases that could have been tested in minutes before the code even left development. I could even dig up the code from svn, write a test for it, and watch it fail. Of course, this code will probably be a mess and therefore hard to unit test, but I'll still dig into doing something like this. It would really show the contrast between how easy it could have been versus how hard and costly it was.
c_maker
I'll add to this that you can also show them a case where unit testing sped up code development immensely. My example is from back when I was working on a reporting system. Towards the end of development, product suddenly decided that everything needed to be done in EST instead of GMT (GMT makes the code easier). Since we had well-tested code, we could make the changes we needed with much less fear that we had missed a piece of code somewhere that needed updating.
RHSeeger
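That kind of timezone switch can be illustrated with a small sketch; the `report_timestamp` function is hypothetical, and the fixed UTC-5 offset (ignoring daylight saving) is an assumption for the example:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical report formatter. Originally it emitted GMT; the change
# request was to emit EST (here modelled as a fixed UTC-5 offset).
EST = timezone(timedelta(hours=-5), "EST")

def report_timestamp(utc_dt):
    # Convert a UTC datetime to EST for display in the report.
    return utc_dt.astimezone(EST).strftime("%Y-%m-%d %H:%M %Z")

# A test like this pins the expected output, so the GMT-to-EST switch
# becomes a change made with confidence rather than fear.
stamp = report_timestamp(datetime(2010, 7, 15, 14, 30, tzinfo=timezone.utc))
assert stamp == "2010-07-15 09:30 EST"
print(stamp)
```

With one such assertion per formatting path, any spot that was forgotten in the conversion fails immediately instead of surfacing in a customer's report.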
+1  A: 

If possible, I would try to provide real-world examples of the difficulty your company has faced in the past when supporting live code without tests. Then highlight, perhaps with specific examples, how much lower the support costs for updates and fixes would have been had tests been in place.

Good luck and let us know how you get on. :)

Daz Lewis
+1  A: 

Present it as Behavior Driven Development or Test Driven Development, not just as unit testing.

Both BDD and TDD have upsides that are easily understood by non-technicians. In fact, one of their key advantages is that they focus on the non-technical.

Unit testing, on the other hand, is a technical concept which can be very abstract to the non-programmer. Test your code? Don't you do that while developing? Why are we paying for test staff? And so on.

bzlm
+2  A: 

1 - Do they need to know, or even approve? You are the engineer, and you know better than they do how to build stuff that works, so they really should not interfere. Would they care what tools and what quality/safety-assurance methods you used if it were a new car model you built for them? I wouldn't think so.

2 - The cost of defects is very, very high. Low quality, defects and technical debt are serious project risks that may jeopardize the entire investment in a software project. Defect prevention is easily more cost-effective than defect detection (by many multiples, according to studies). Unit tests have the shortest feedback loop possible, so they avoid systematically low project quality and significantly reduce project risk.

3 - You can't go fast over a long period of time without a very low defect rate - i.e. investing beats spending/speculating (this they should get).

Mahol25
A: 

Draw an analogy to something the managers probably use on a regular basis - like the spell checker in their mail client or document editor.

VoiceOfUnreason