My project has some rather open-ended requirements in terms of file size limits and performance. In the end, performance varies by platform, and execution time is typically the only issue: a faster platform means faster execution. Our business guy says we should find our current maximum capability, make sure we exceed the legacy application we're replacing, and set the bar only a little higher than legacy. Then in every new release we can give them more until we hit our max. Nobody has to know what we can really do; that way we can show improvement over time. This makes me feel a little dirty.
UPDATE: A better example. The situation is more like this: the legacy application could only handle 2.5M records. We can process 20M records NOW, but we'll set the limit to 5M so that in 6 months we can put out a new release that does 7M, and so on. We're still above legacy.
We're still giving the customer what they want, but we could give them more. We're holding it back NOW so we can give them more in the FUTURE and look like heroes. Something about this feels wrong from an academic/engineering/science point of view.
How do other people feel about this? Has anyone encountered this or a similar "working the numbers" type of situation? Should I just get over it?
NOTE: I'm not sure the title is 100% correct, but I couldn't find a better way to word it. Help?