A team with, say, 100 people will be less productive per person than a team with just 5.
So the question is, is there a way (a formula, a standard, or at least a hint) to estimate that overhead?
Good question, presumably based on the fact that the number of communication pathways grows roughly with the square of team size, and this affects the team's productivity (i.e. more time is wasted communicating).
In terms of formulae, you could use this for the number of interconnects in a team of n people: n(n - 1) / 2.
With this in mind, there is a school of thought that advocates an ideal team size of 5.
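For illustration, a minimal sketch of that formula (the team sizes below are just examples) shows how quickly the number of communication paths grows between a team of 5 and a team of 100:

    # Minimal sketch: pairwise communication paths in a fully
    # connected team of n people, using n * (n - 1) / 2.

    def interconnects(n: int) -> int:
        """Number of pairwise communication paths in a team of n people."""
        return n * (n - 1) // 2

    for size in (2, 5, 10, 50, 100):
        print(f"{size:>3} people -> {interconnects(size):>4} communication paths")

A team of 5 has 10 paths; a team of 100 has 4,950. That disproportionate growth is the intuition behind keeping teams small.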
More a suggestion than an answer.
I'm fairly sure that the book Code Complete discussed this, and probably had links to some research to back it up as well. I'm not sure where my copy is right now, so I can't check, but if you've got access to it, take a look.
Otherwise it might be worth searching Steve McConnell's web pages/articles.
Only my opinion and experience...
In my experience, smaller teams tend to work on smaller projects; smaller projects tend to be easier to architect and develop, and they still allow all team members to understand most of the project. As projects get larger, more coordination is required, since not everyone can understand the whole system. As teams get larger, this adds administrative overhead, which should be included in the team size and thus in productivity measurements.
The idea was popularized in Brooks' The Mythical Man-Month. If I remember correctly, his estimate was that three people could do twice the work of one, and so on up (nine to do four times the work of one). It seems like an extreme degradation, and I don't remember him providing supporting evidence.
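Taking those remembered figures at face value (tripling headcount doubles output: 3 people do 2x, 9 people do 4x), the implied scaling would be total output proportional to n^(log 2 / log 3), roughly n^0.63. The sketch below is only an illustration of that reading, not a formula from Brooks:

    import math

    # Illustration only: if tripling the team doubles total output
    # (3 people -> 2x, 9 people -> 4x), total output scales as
    # n ** (log 2 / log 3) ~= n ** 0.63. This is an interpretation of
    # the figures remembered above, not Brooks' own formula.
    exponent = math.log(2) / math.log(3)

    for n in (1, 3, 9, 27, 100):
        total = n ** exponent
        per_person = total / n
        print(f"{n:>3} people: {total:5.2f}x total output, "
              f"{per_person:.2f}x per person")

Under that reading, a 100-person team would produce only about 18x the output of one person, i.e. under a fifth of the per-person productivity, which is indeed an extreme degradation.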