Bounty clarification
I know it's a subjective question. The ideal answer I'm looking for is one that explains why the quoted scenario here would be so surprising.
If you think the quoted scenario is in fact not surprising and to be expected, please break down the steps to show how such a little app can take over a month and several thousand dollars of development. I went quite far to do the calculations (for example, looking up minimum wages), so I expect the ideal answer to do something similar.
If you think the quoted scenario is indeed overestimated, please pinpoint your reasons exactly. What mistakes can you spot in his calculation that led to such a huge cost for a simple application like that? How would you have done it differently? (No need to write out the whole process, but details rather than generalized feelings would be nice.)
I know questions about FPA have been asked numerous times before, but this time I'm taking a more analytical angle at it, backed up with data.
1. First, some data
This question is based on a tutorial. Its author has a "Sample Count" section where he demonstrates the process step by step. You can see some screenshots of his sample application here.
In the end, he calculated the unadjusted FP count to be 99.
There is another article on InformIT with industry data on typical hours/FP. It ranges from 2 hours/FP to 27.4 hours/FP. Let's stick with 2 hours/FP for the moment (since SO readers are probably the more efficient crowd :p).
2. Reality check!?
Now just check out the screenshots again.
Do a little math here:
99 FP * 2 hours/FP = 198 hours
198 hours / 40 hours per week ≈ 5 weeks
Seriously? That sample application is going to take 5 weeks to implement? Is it just me, or would it not take any decent programmer longer than one week (I'm not even saying a weekend) to complete it?
Now let's try estimating the cost of the project. We'll use New York's current minimum wage (per Wikipedia), which is $7.25/hour:
198 hours * $7.25/hour = $1,435.50
From what I can see in the screenshots, this application is a small Excel-improvement app. I could have bought MS Office Pro for 200 bucks, which gives me greater interoperability (.xls files) and flexibility (spreadsheets).
(For the record, that same website has another article discussing productivity. It seems like they typically use 4.2 hours/FP, which gives us even more shocking stats:
99 FP * 4.2 hours/FP ≈ 416 hours ≈ 10 weeks = two and a half whopping months!
416 hours * $7.25/hour ≈ $3,000 zomg
That's even assuming that all our poor coders get the minimum wage!)
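If anyone wants to re-run the arithmetic with their own figures, here's a minimal sketch (plain Python, nothing FPA-specific; it only uses the 99 unadjusted FP, the 2 and 4.2 hours/FP figures quoted above, and the $7.25 wage):

```python
# Reproduce the estimates above: effort and cost from an unadjusted FP count.
UNADJUSTED_FP = 99      # from the tutorial's "Sample Count" section
HOURLY_WAGE = 7.25      # New York minimum wage (USD/hour), per Wikipedia
HOURS_PER_WEEK = 40

def estimate(hours_per_fp):
    """Return (hours, weeks, cost) for a given productivity figure."""
    hours = UNADJUSTED_FP * hours_per_fp
    return hours, hours / HOURS_PER_WEEK, hours * HOURLY_WAGE

for rate in (2.0, 4.2):  # the two hours/FP figures quoted from InformIT
    hours, weeks, cost = estimate(rate)
    print(f"{rate} hours/FP: {hours:.0f} hours, {weeks:.1f} weeks, ${cost:,.2f}")
```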
3. Am I missing something here?
Right now, I can come up with several possible explanations:
- FPA is really only suited for bigger projects (1000+ FPs), so it becomes extremely inaccurate at smaller scales.
- The hours/FP metric varies wildly from team to team and project to project. For a small project like this, we could have used something like 0.5 hours/FP. (This kind of makes the whole estimation thing pointless, unless my firm does the same type of projects for several years with the same team, which isn't really common. See the sketch below for how wide the range gets.)
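To illustrate that second point, here's the same arithmetic run across the published range plus the 0.5 hours/FP figure I guessed at (the 0.5 is my assumption, not industry data):

```python
UNADJUSTED_FP = 99  # the tutorial's unadjusted count

# 0.5 is my guess above; 2 and 27.4 are the low/high ends of the InformIT data.
for hours_per_fp in (0.5, 2.0, 27.4):
    hours = UNADJUSTED_FP * hours_per_fp
    print(f"{hours_per_fp:>4} hours/FP -> {hours:7,.1f} hours (~{hours / 40:.1f} weeks)")
# Roughly 50 hours at the low end vs. ~2,700 hours at the high end:
# a ~55x spread for the exact same 99 FP.
```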
From my experience with several software metrics, Function Points are really not a lightweight metric. If the hours/FP figure fluctuates so much, then what's the point? Maybe I could have gone with User Story Points, which are a lot faster to estimate and arguably almost as uncertain.
What would be the FP experts' answers to this?