
The software development team in my organization (which develops APIs - middleware) is gearing up to adopt at least one best practice at a time. The following are on the list:

Unit Testing (in its real sense), Automated unit testing, Test Driven Design & Development, Static code analysis, Continuous integration capabilities, etc.

Can someone please point me to a study that shows which 'best' practices, when adopted, yield a better ROI and improve software quality faster? Is there such a study out there? It would help me prioritize (and support my case for) the implementation of these practices.

A: 

There are some references for ROI with respect to unit testing and TDD. See my response to this related question: http://stackoverflow.com/questions/237000/is-there-hard-evidence-of-the-roi-of-unit-testing#237067.

tvanfosson
+4  A: 

"a study that shows which 'best' practices when adopted have a better ROI, and improves software quality faster"

Wouldn't that be great! If there was such a thing, we'd all be doing it, and you'd simply read it in DDJ.

Since there isn't, you have to make a painful judgement.

There is no "do X for an ROI of 8%". Some of the techniques require a significant investment. Others can be started for free.

  • Unit Testing (in its real sense) - Free - ROI starts immediately (see the sketch after this list).
  • Automated unit testing - not free - requires tooling to run the suite automatically.
  • Test Driven Design & Development - Free - ROI starts immediately.
  • Static code analysis - requires tools.
  • Continuous integration capabilities - inexpensive, but not free.
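
To make "free" concrete, here is a minimal sketch of a unit test that needs nothing beyond the language's standard library; validate_row is a hypothetical stand-in for the kind of middleware logic the question describes:

    import unittest

    # Hypothetical example: a tiny piece of middleware validation logic.
    def validate_row(row):
        """Reject rows that are missing an 'id' or have a negative amount."""
        return "id" in row and row.get("amount", 0) >= 0

    # The test requires no extra tools: just the standard unittest module.
    class ValidateRowTest(unittest.TestCase):
        def test_accepts_well_formed_row(self):
            self.assertTrue(validate_row({"id": 1, "amount": 10}))

        def test_rejects_row_without_id(self):
            self.assertFalse(validate_row({"amount": 10}))

        def test_rejects_negative_amount(self):
            self.assertFalse(validate_row({"id": 1, "amount": -5}))

    if __name__ == "__main__":
        unittest.main()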

You can't know the ROI. So you can only prioritize on investment. Some things are easier for people to adopt than others. You have to factor in your team's willingness to embrace the technique.

Edit. Unit Testing is Free.

  • "time spend coding the test could have been taken to code the next feature on the list" True, testing means developers do more work, but support does less work debugging. I think this is not a 1:1 trade. A little more time spent writing (and passing) formal unit tests dramatically reduces support costs.

  • "What about legacy code?" The point is that free is a matter of managing cost. If you add unit tests to legacy code, the cost isn't free. So don't do that. Instead, add unit tests as part of maintenance, bug-fixing and new development -- then it's free.

  • "Traning is an issue" In my experience, it's a matter of a few solid examples, and management demand for unit tests in addition to code. It doesn't require more than an all-hands meeting to explain that unit tests are required and here are the examples. Then it requires everyone report their status as "tests written/tests passed". You aren't 60% done, you're 232 out of 315 tests.

  • "it's only free on average if it works for a given project" Always true, good point.

  • "require more time, time aren't free for the business" You can either write bad code that barely works and requires a lot of support, or you can write good code that works and doesn't require a lot of support. I think that the time spent getting tests to actually pass reduces support, maintenance and debugging costs. In my experience, the value of unit tests for refactoring dramatically reduces the time to make architectural changes. It reduces the time to add features.

  • "I do not think either that it's ROI immediately" Actually, one unit test has such a huge ROI that it's hard to characterize. The first test to pass becomes the one think that you can really trust. Having just one trustworthy piece of code is a time-saver because it's one less thing you have to spend a lot of time thinking about.

War Story

This week I had to finish a bulk data loader; it validates and loads 30,000-row files we accept from customers. We have a nice library that we use for uploading some internally developed files. I wanted to use that module for the customer files, but the customer files are different enough that I could see the library module's API wasn't really suitable.

So I rewrote the API, reran the tests and checked the changes in. It was a significant API change. Much breakage. Much grepping the source to find every reference and fix them.

After running the relevant tests, I checked it in. And then I reran what I thought was a not-closely-related test. Oops. It had a failure. It was testing something that wasn't part of the API, which had also broken. Fixed. Checked in again (an hour late).
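
As a purely hypothetical illustration (the actual loader and library aren't shown in this answer), this is the kind of breakage a seemingly unrelated test catches: the API is reworked to take an extra required argument, and a test for an older caller that nobody remembered to update fails immediately.

    import unittest

    # Hypothetical sketch only; the real library from the war story isn't shown.
    # The API was reworked so that load_file now requires a file_format argument.
    def load_file(path, file_format):
        if file_format not in ("internal", "customer"):
            raise ValueError("unknown format: %s" % file_format)
        return {"path": path, "format": file_format}

    # An older caller that wasn't updated during the API rework.
    def load_internal_report(path):
        return load_file(path)  # breaks: file_format is now required

    class InternalReportTest(unittest.TestCase):
        def test_loads_internal_report(self):
            # Fails with a TypeError after the API change, flagging the missed caller.
            result = load_internal_report("report.dat")
            self.assertEqual(result["format"], "internal")

    if __name__ == "__main__":
        unittest.main()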

Without basic unit testing, this would have broken in QA, required a bug report, and required debugging and rework. Look at the labor: 1 hour of a QA person's time to find and report the bug + 2 hours of developer time to reconstruct the QA scenario and locate the problem + 1 hour to determine what to fix.

With unit testing: 1 hour to realize that a test didn't pass, and fix the code.

Bottom Line. Did it take me 3 hours to write the test? No. But the project got three hours back for my investment in writing the test.

S.Lott
Unit Testing is free? If I were playing devil's advocate, I could argue that the time spent coding the test could have been used to code the next feature on the list (so more features per hour == more productivity), and that the bugs would be charged to the support budget that already exists anyway, so it's not free.
Julien Grenier
Sorry. It's free. It doesn't require a lot of tools or even a lot of training. You take the money out of the support budget and put it into development. Make the support people do help desk or server consolidation projects instead of debugging.
S.Lott
No, it's absolutely not free. What about legacy code? Training *is* an issue, and while it can be a good time investment, it's only free on average if it works for a given project. I don't think the ROI is quite immediate either.
Draemon
A lot of your "free" items require more time, and time isn't free for the business... I don't think the ROI is immediate either...
Daok
@S. Lott What is a "support cost"?
leeand00
@leeand00: A support cost is a cost to support the software. QA, debugging, help-desk, problem tracking, defect management, production turnover, all that "support" which involves substantial "cost".
S.Lott
+1  A: 

Are you looking for something like this?

Patrick Cuff
The second reference is http://www.ibm.com/developerworks/rational/library/edge/08/may08/dunn/index.html which is cool, but treats "Agile" as a whole, not individual techniques.
S.Lott
A: 

There is such a thing as a "local optimum". You can read about it in Goldratt's book The Goal. It argues that an innovation has value only if it improves overall throughput. The decision to implement a new technology should be tied to the critical paths inside your projects. If a technology speeds up a process that is already fast enough, it only creates an unnecessary backlog of finished modules, which does not necessarily improve the overall speed of project development.

Din
+1  A: 

You're assuming that the list you present constitutes a set of "best practices" (although I'd agree that it probably does, btw).

Rather than try to cherry-pick one process change, why not examine your current practices?

Ask yourself this:

Where are you feeling the most pain? What might you change to reduce/eliminate it?

Repeat until pain-free.

Mike Woodhouse
A: 

You don't mention code reviews in your list. For our team, this is probably what gave us the greatest ROI (yes, the investment was steep, but the return was even greater). I know Code Complete (the original edition at least) cites statistics on the efficiency of reviews versus testing at finding defects.

Kena
A: 

I wish I had a better answer than the other answers, but I don't, because what I think really pays off is not conventional at present. That is: in design, minimize redundancy. It is easy to say but takes experience to do.

In data it means keeping the data normalized, and when it cannot be, handling it in a loose fashion that can tolerate some inconsistency, not relying on tightly-bound notifications. If you do this, it simplifies the code a lot and reduces the need for unit tests.

In source code, it means that if some of your "input data" changes at a very slow rate, you can consider code generation as a way to simplify the source and gain some performance. If the source code is simpler, it is easier to review, and the need to test it is reduced.
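
A rough sketch of what that might look like, assuming the slowly changing input is a small lookup table (the country-code table below is invented purely for illustration):

    # Rough sketch: instead of parsing a lookup file on every run, emit a plain
    # Python module from it once, whenever the (slowly changing) data changes.
    # The country-code table is an invented example.
    COUNTRY_CODES = {"US": "United States", "CA": "Canada", "MX": "Mexico"}

    def generate_module(codes, path="country_codes.py"):
        with open(path, "w") as out:
            out.write("# Generated file; do not edit by hand.\n")
            out.write("CODES = %r\n" % codes)

    if __name__ == "__main__":
        generate_module(COUNTRY_CODES)
        # Consumers then simply do: from country_codes import CODES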

Not to be a grump, but I'm afraid, from the projects I've seen, there is a strong tendency to over-design, with way too many "layers of abstraction" whose correctness would not have to be questioned if they weren't even there.

Mike Dunlavey
A: 

One practice at a time is not going to give the best ROI. The practices are not independent.

Stephan Eggermont