Hi,

I'm consulting at a traditional business that has almost zero understanding of software development. They are justifiably concerned about how to measure our progress and productivity, and are currently looking for a way to do so.

Obviously, I'm concerned they will adopt some easy-to-measure but bogus metric (e.g., lines of code).

I keep our bug tracker updated and record the minimum, maximum, probable, and actual estimates, so we always know the amount of work in the queue, and it's easy to calculate our estimate inaccuracy.
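To make that calculation concrete, here is a minimal sketch; the field names `probable` and `actual` are just my illustration of the tracker data described above, not its actual schema:

```python
def estimate_error(tasks):
    """Mean relative estimation error: how far actuals land from
    the 'probable' estimate, as a fraction of that estimate."""
    errors = [abs(t["actual"] - t["probable"]) / t["probable"] for t in tasks]
    return sum(errors) / len(errors)

# Invented example data: one task 25% over estimate, one 25% under.
tasks = [
    {"probable": 4.0, "actual": 5.0},
    {"probable": 8.0, "actual": 6.0},
]
print(f"mean estimate inaccuracy: {estimate_error(tasks):.0%}")  # 25%
```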

I'm suggesting we watch the number of bugs added, the number of bugs resolved, the added/resolved difference, and the bug-count spikes after each deployment (it's an internal web app). For the spikes, the question isn't really how many bugs there are, but whether the spikes shrink with each bug-fix cycle, and how much time bug fixes take after each deployment. This can be used to estimate expected post-deployment work.
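The arithmetic behind those two metrics is simple; here is a sketch with invented numbers:

```python
def net_open_bugs(added, resolved):
    """Running added-minus-resolved difference per period."""
    net, running = [], 0
    for a, r in zip(added, resolved):
        running += a - r
        net.append(running)
    return net

def spikes_shrinking(spikes):
    """True if the post-deployment bug spike is smaller on each release."""
    return all(later < earlier for earlier, later in zip(spikes, spikes[1:]))

print(net_open_bugs([10, 4, 2], [6, 5, 3]))  # [4, 3, 2] -- backlog trending down
print(spikes_shrinking([10, 7, 4, 2]))       # True  -- quality improving
print(spikes_shrinking([10, 11, 9]))         # False -- spikes not shrinking
```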

Does anybody know a way to measure the development process in terms a layman can understand? As you can probably tell from the above, I want an honest and transparent way to do this.

Thanks in advance.

Regards, John

PS: If this exact question has been asked, please point me in the right direction.

EDIT: To clarify some things I didn't mention in the original post:

- The bug tracker also tracks new features and change requests. I added a custom field to indicate whether an item is a bug, change, feature, task, or support ticket.

- When I mentioned the bug-count spikes, what I'm suggesting is focusing on the series of bug counts immediately after release. So something like 10, 5, 3, 1, 1, 0 is a lot better than 10, 11, 9, 8, 9, 6.

- Also, this is a small three-person team: two developers plus our manager, who is an IT admin. So many of the collaborative approaches will not work.

Thanks again for the contributions.

A: 

It sounds like you're on the right track, assuming the customer can understand non-technical terms like "time" and "bugs". If you are enhancing the software in terms of features, this may also make sense to them. "45% completed" appears to be informational, but it is really a meaningless number no matter how accurate the measurement may be. A list of the 15 features planned for the current iteration, along with the 7 completed so far, might be a warmer and fuzzier approach.

Steven A. Lowe
Thanks. I'm starting to lean towards just listing what is finished and what will be done next. Also, I agree about staying away from '% completed'; it's just asking for trouble, since it feels like you are ALWAYS 95% finished. ;-)
John MacIntyre
+2  A: 

You might consider creating a set of milestones, where specific functionality will exist, and can be verified by the client.

EvilTeach
+3  A: 

Beware of focusing on number of bugs fixed. It's a good way to encourage development to fix trivial issues and ignore difficult ones. Speaking from experience, it's a brilliant way to compound technical debt by piling superficial "progress" on top of unsound foundations.

In fact, putting too much emphasis on any single metric is likely to have similar unintended consequences.

Dan Dyer
Totally agree. What I'm suggesting is focusing on the series of bug counts immediately after release. So something like 10, 5, 3, 1, 1, 0 is a lot better than 10, 11, 9, 8, 9, 6. PS: Thanks for the link; I've never read his website before.
John MacIntyre
+1  A: 

See How do you report your project status?, and Life Cycle Tools Suite for relevant questions.

Everyone wants a simple measurement of progress and productivity. If such a thing existed, we'd all be using it.

You've mentioned tracking bugs. But bug fixes are not as valuable as new features and business problems solved. Where and how are those tracked?

I'm a fan of Release Burndown charts. I think they show useful information at a more interesting level than individual bug fixes. They show significant releases: things users actually see and prioritize.
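A release burndown is just the remaining work plotted per iteration; a minimal text-only sketch (all numbers invented for illustration):

```python
def burndown(total_points, completed_per_iteration):
    """Remaining story points at the end of each iteration."""
    remaining, out = total_points, []
    for done in completed_per_iteration:
        remaining -= done
        out.append(remaining)
    return out

# 40 points planned for the release; 4 iterations of completed work.
for i, left in enumerate(burndown(40, [8, 10, 7, 9]), start=1):
    print(f"iteration {i}: {left:2d} points remaining " + "#" * left)
```

Even this crude chart tells a non-technical stakeholder at a glance whether the release is converging.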

S.Lott
I'm using the bug tracker for everything, including features. Everything is in there along with the estimates. Technically, it's working very well; however, I won't do it again, since even though I've never had more than 8-10 bugs open at any time, everybody thinks I've got 80.
John MacIntyre
+1  A: 

Use an Agile approach, and measure progress by user stories successfully delivered to the business. Word the user stories in business terms, so it's easy for the business to see that value is being delivered.

Everything else is just indirect measures of what really counts.

Paul
A: 

Bugs reported and fixed is just one aspect. Features specified and delivered is another. Make sure you describe both in terms the customer can understand and prioritize.

You can then do iteration planning for short iterations (a week or perhaps two).

Put each feature and issue on a card. Have the customer prioritize the cards using card sorting. If there are a lot of cards, have him pick the top ten or so.

Have the developers estimate the size/effort of those cards in story points using planning poker (everyone estimates in parallel in secret, and all estimates are revealed at the same time; large differences mean the card isn't clear and should be explained more). Make sure cards that are too large are split up. The smallest size should be a few hours or half a day.

From the previous iteration you know how many story points were done, and from the availability of developers in the next iteration you know how many cards you can expect to finish (team velocity).
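In numbers, that forecast works roughly like this (a sketch with made-up figures):

```python
import math

def velocity(points_per_iteration):
    """Average story points completed per iteration (team velocity)."""
    return sum(points_per_iteration) / len(points_per_iteration)

def forecast_iterations(backlog_points, recent_velocity):
    """Whole iterations needed to finish the remaining backlog."""
    return math.ceil(backlog_points / recent_velocity)

v = velocity([8, 10, 9])           # 9.0 points per iteration
print(forecast_iterations(45, v))  # 5 iterations to clear a 45-point backlog
```

In practice you would scale the velocity by developer availability in the coming iteration before forecasting.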

Another thing: to show quality issues, you might want to indicate the sub-system(s) affected by each bug or feature. Real traceability may sound great, but in practice it is often too much effort. The customer should make a conscious decision about whether it's worth it.

Put the info on the wall, not just in a bug tracker. That way, everyone coming into the project room has an instant overview of the project status.

Stephan Eggermont
+2  A: 

The best way to measure progress and communicate how much work is being done to non-technical people is to estimate your bugs/tasks using story points. Story points mean that you take your bugs and feature requests and measure how "big" they are, as opposed to how long it will take to implement a feature or fix a bug.

It sounds a little awkward at first, talking about "size" instead of "time", but after a while (2-3 months) you are able to track quite accurately how much work is being done and, most importantly, to measure and forecast your expected future productivity. This approach is well documented in books about agile estimation; get the one written by Mike Cohn if you are interested.

If that doesn't fit your needs, you should get the stakeholders to assign a business value to each of the bugs or feature requests (keep it simple: a value ranging from 1 to 5 is all you need). How much value does this feature/bugfix add to the business? They are able to do that without knowing anything about the tech side. Once every bug/new feature has a business value attached, you can track how much business value you've provided as a developer during a fixed period of time.
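Aggregating that delivered value per period could look like the following sketch; the data is hypothetical, and `value` is the stakeholder's 1-5 rating described above:

```python
from collections import defaultdict

def value_delivered_per_period(items):
    """Sum stakeholder-assigned business value (1-5) per period."""
    totals = defaultdict(int)
    for item in items:
        totals[item["period"]] += item["value"]
    return dict(totals)

# Invented closed items: a big feature, a small bugfix, a medium feature.
closed = [
    {"period": "2009-01", "value": 5},
    {"period": "2009-01", "value": 2},
    {"period": "2009-02", "value": 4},
]
print(value_delivered_per_period(closed))  # {'2009-01': 7, '2009-02': 4}
```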

Uhmm, you should do both: business decides priority, development decides size/effort.
Stephan Eggermont
Are you saying that we should rate the features according to business value (1-5), then summarize the value we add each period? I'm familiar with this, but never really thought about aggregating it to measure output per period. This is so crazy it may just work! ;-)
John MacIntyre
A: 

Make frequent intermediate releases. Ask the customer to try them out and find out whether they satisfy the requirements.

Manoj
A: 

One thing to watch out for when demonstrating software progress is that non-technical folks often have trouble differentiating between an application's presentation and its completeness. Thus, it is a good idea to avoid making nice-looking buttons and screens for features that are not complete. One easily understood way to present progress in terms of features is to add buttons and screens only as features are finished. In other words, have your application's GUI closely follow your application's progress. If this is impractical, do the same thing but use mock-ups or real screenshots in place of a working application.

Brian
+1  A: 

I can see you have a small team: two developers. So adding any new procedure that takes coding time away from your programmers is counter-productive in some ways. I don't doubt that some of the techniques suggested here would work great, but your programmers are probably very busy and would rather be coding than filling out cards or making multiple estimates.

What people are saying about bugs being only part of the picture is true, but you deserve credit for using a bug tracking system in such a professional manner. Well done, John.

Also, you asked about a layman's measure. It doesn't get any more "lay" than % of tasks completed. A client will very much understand this kind of update:

  • In total, 55% of all tasks on your project have been completed.

Of course, to make this kind of estimate, you need a project schedule that lists all the tasks on the project (see http://pm4web.blogspot.com/2008/07/google-spreadsheets-for-project.html). This is based on Joel Spolsky's method, but has some additional metrics that anyone can understand (such as how many hours are left in the coding phase, how many hours are assigned to different programmers, etc.).

Someone also suggested using milestones. They are an excellent idea and a tangible reflection of progress (though not necessarily of quality). I tend to use milestones like 'feature-complete preview (pre-QC cycle)', 'preview release (post-QC cycle)', and 'final release'.

Any method of progress estimation is only going to be accurate to a certain degree, and the more accurate ones tend to take more time. And that's the key issue: time. Personally, I think you are better off spending time on more testing than on status analysis.

LM

Our bug tracker is set up based on the same Spolsky post. Thanks.
John MacIntyre