I work for a software development company where around 100 people work on a product; roughly a third of them are in QA. Lately, management has wanted a better way to rate individual programmers' performance, and the suggestion was to use bug reports as the measurement: the more bug reports filed against a developer, the worse he is. This seems ill-advised for more reasons than I can list, e.g. it is a subjective way of measuring, and developers work on different projects of differing complexity. In addition, if QA is measured by the number of bug reports they generate, there will be endless arguments about the validity of individual bug reports.
What would be a better way to measure developers' performance in such a setting?
One suggestion would be to not use bug reports from QA as the measure, but instead bug reports from outside, e.g. from beta testers; when such public bug reports come in, QA could be measured by them as well.
EDIT #1: After reading some of your excellent responses, I think the general problem with the metric described above is that it is purely negative: counting bug reports punishes mistakes, but it doesn't encourage producing good-quality code.
EDIT #2: I think the underlying problem is that there are two worlds here. On one side are the non-programmers, who basically treat programmers as workers and want metrics, preferably LOC per minute. On the other side are the programmers, who want to see themselves as artists or craftsmen: "please don't disturb me, I'm c-o-d-i-n-g" :) I don't think quality can be measured with metrics without the measurement becoming counterproductive. Instead, things like how a person reacts to bugs, willingness to change, creativity, and above all the quality of their work are what matter, yet these are mostly not measurable.