tags:

views: 849

answers: 7

Every day after I finish a hard day's coding (or an easy day's YouTubing) I log work done against the tickets I have been (ostensibly) working on. Then Mr. Manager takes this oh-so-useful information and builds a burn-down chart. This then allows us to track how 'complete' we are in the overall scope of things. It also allows us to measure our velocity (essentially How Fast We Get Things Done).

I've noticed that some people complete their tasks well under the original estimate (which makes us look good) and some people complete their tasks over the original estimate (which makes us look bad).

While this measures how quickly people finish tasks it doesn't measure how well. It really gives you no solid indication of developer performance.

Neither does lines of code (you don't want devs to be overly verbose).

Neither does bugs raised (you are now basing the performance of a developer on the performance of a tester; they could raise too few bugs or too many).

So how would you go about measuring developer performance?

+1  A: 

Try the answers from this question.

Pat
+1  A: 

I don't think there's an automated test that's going to get you a 100% accurate assessment. Combining some of the methods you mentioned like bugs addressed/time spent is reasonable, but more importantly, individual assessments by senior developers who know what to look for in terms of code quality/quantity is essential for not only the performance, but also the growth of developers.

eplawless
A: 

I measure their ability to estimate, and then whether they hit code complete on time.

Ability to estimate gives me a proper project timeline, essential for talking with the business. Then I look at whether they actually hit that date for code complete.

Look at FogBugz! Software management by software devs.

+3  A: 

Try searching Joel's blog ... you should find a very interesting post about how you CAN'T measure a programmer's performance. However (and yes, I'm plugging Joel's software), you can make better and better estimated guesses the more history that gets accumulated for a given programmer.

Check out this link and read the white paper at the bottom for more info.

xanadont
+17  A: 

You can't really measure developer performance because as soon as you add metrics, the developers latch onto those metrics and use them to boost measured performance. The most likely people to do this are the ones doing the worst jobs.

There are reams of info out on the internet on this subject, but the first step is to ask yourself why you need this information. Is it to give you an idea of project end dates, or is it to rate or rank developers?

If the latter - good luck with that - any metric you create will be too simple and too easily gamed. These metrics will also fail to measure how good a job a developer is doing. That'll foster bad feeling amongst developers, and it'll be counterproductive.

If you're trying to get an idea of how long a job is going to take... Here's what I do.

  • Split a job into manageable tasks. Each task should take no longer than three hours. Each task should be defined in terms of the number of minutes it could take to complete. If a task takes more than three hours, you need to split it into subtasks.
  • Round all minutes up to the nearest 15 minute increment.
  • Add all the minutes from everyone together. (This step is important)
  • Divide by 60 to give a total number of hours.
  • Double it.
  • I give myself 5 working hours in a day - doesn't sound like a lot, but you'll be amazed at how many interruptions there are in a normal day.

For me, this usually works out about right. Once you've got the value above, you can multiply it by a factor calculated from the last three projects to allow for team bias (too optimistic / too pessimistic).
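The steps above can be sketched as a small Python function. This is just a sketch of the procedure described; the names `estimate_project_hours` and `bias_factor` (the last-three-projects multiplier) are my own, not part of the original answer:

```python
import math

def estimate_project_hours(task_minutes, bias_factor=1.0):
    """Estimate total project hours from per-task minute estimates.

    task_minutes: per-task estimates in minutes; each task should be
        small enough to finish in under three hours, else split it.
    bias_factor: hypothetical multiplier calculated from the last
        three projects to correct for team optimism/pessimism.
    """
    # Round each estimate up to the nearest 15-minute increment.
    rounded = [math.ceil(m / 15) * 15 for m in task_minutes]
    # Add everyone's minutes together, convert to hours, then double.
    hours = sum(rounded) / 60 * 2
    return hours * bias_factor

def working_days(total_hours, hours_per_day=5):
    # Assume only ~5 productive hours per day after interruptions.
    return math.ceil(total_hours / hours_per_day)
```

For example, tasks of 50, 100 and 170 minutes round up to 60, 105 and 180 minutes (345 total), which is 5.75 hours, doubled to 11.5 hours, or about three 5-hour working days.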

On developer performance. I don't really understand why you're concerned about developer performance. There's no need for you to be trying to highlight bad developers using the power of math. If you think someone is swinging the lead, bring it up with your manager and leave it at that.

If the management does nothing about it and it's harming your health / job / career, then it's time for you to move on.

seanyboy
A: 

What exactly do you expect to achieve with any of these measures? Every task a programmer gets is a story of its own. You may get a task that is estimated to be finished in 3 hours, but you run into a bunch of little problems that everyone overlooked and finish in 5 hours; this can happen a lot on poorly managed projects. How do you know if your project is poorly managed? You can try measuring lines of code, but the only thing you'll achieve is the worst code you can get, because if even one person on your team suspects you are doing this, everyone else will start acting like they are being measured by lines of code.

Vasil
A: 

This isn't about finding out who's a lazy or crappy programmer, which, interestingly, is what you folks (programmers? ha ha) are latching onto. :)

Imagine a company with billions of dollars in IT budget. Billions. Some of it is dev, some support, some hardware/infra - just to give you a large, rough example... How do they measure developer productivity?

I'll pose the problem differently: the need more than the want (which is always a problem with requirements, right? But I digress.)

You have a very large IT organization. You need a set of metrics/KPIs- whatever you call them- that let you baseline developer productivity and then track it over time to see if the projects you put in place to improve productivity are effective. Does it help to give them free pizza and Mountain Dew? Does it help to give them all the same development tools? Does your IT shop become more effective if you move to Agile methodology and outsource everyone to Mongolia, Bangalore, or Mississippi?

The ask is for repeatable, reliable, actionable metrics.

Suggestions? / Go.

Mr. Metrics.