I know the question about measuring developer performance has been asked to death, but please bear with me. I'm aware of the age-old debate about how you cannot measure the performance of developers, but the reality is that at our company there is a "need" to do it one way or another.
I work for a relatively small company (small in terms of developers), and management felt the need to measure developer performance based on "functionality that passes testing (QA) on the first iteration".
We somehow managed to convince them that this was a bad idea for various reasons, and instead settled on measuring developers by whether the code they put into testing has all unit tests passing. Since in our team there is no "requirement" per se to write unit tests up front, we felt this was an opportunity to formalise the practice - i.e. give developers some incentive to write unit tests.
My problem is this: since arguably we will not be releasing code to QA that does not pass all unit tests, how can one reasonably measure developer performance based on unit tests? Based on unit tests, what makes a good developer stand out?
- Functionality that fails even though its unit tests pass?
- Not writing any unit tests for a given piece of functionality, or writing inadequate ones?
- The quality of the unit tests written?
- The number of unit tests written?
Any suggestions would be much appreciated. Or am I completely off the mark with this kind of performance measurement?