views: 140
answers: 2
I've been struggling with an application I'm writing and I think I'm beginning to see that my problem is premature optimization. The perfectionist side of me wants to make everything optimal and perfect the first time through, but I'm finding this is complicating the design quite a bit. Instead of writing small, testable functions that do one simple thing well, I'm leaning towards cramming in as much functionality as possible in order to be more efficient.

For example, I'm avoiding multiple trips to the database for the same piece of information at the cost of my code becoming more complex. One part of me wants to just not worry about redundant database calls. It would make it easier to write correct code and the amount of data being fetched is small anyway. The other part of me feels very dirty and unclean doing this. :-)
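One way to keep both sides happy is to write the obviously correct, repeated-call version first, and layer a cache on only if profiling says it matters. A minimal sketch, assuming a hypothetical `fetch_user` function standing in for whatever actually hits the database:

```python
import functools

# Simulated database table and a call log; stand-ins for real queries.
FAKE_DB = {1: "alice", 2: "bob"}
CALLS = []  # records each "trip to the database"

def fetch_user(user_id):
    """Naive version: one query per call. Simple and obviously correct."""
    CALLS.append(user_id)
    return FAKE_DB[user_id]

# If a profiler later shows the repeated trips actually matter, a
# one-line cache can be layered on without touching any caller:
@functools.lru_cache(maxsize=None)
def fetch_user_cached(user_id):
    return fetch_user(user_id)
```

The point of the sketch: because the naive version is small and testable, the optimization is a local, reversible change rather than something the whole design has to be contorted around up front.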

I'm leaning towards just going to the database multiple times, which I think is the right move here. It's more important that I finish the project and I feel like I'm getting hung up because of optimizations like this. My question is: is this the right strategy to be using when avoiding premature optimization?

+14  A: 

This is the right strategy in general. Get the code to work, thoroughly covered with automated tests.

You can then run the automated tests while the program is under the control of a profiler, to find out where the program is spending time and/or memory. That will show you where to optimize.
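In Python, for instance, that can be as simple as running the code under `cProfile` and printing the hot spots (the `busy_work` function here is just a hypothetical stand-in for the program under test):

```python
import cProfile
import io
import pstats

# Hypothetical function standing in for "the program under test".
def busy_work(n):
    return sum(i * i for i in range(n))

def profile_run(func, *args):
    """Run func under cProfile and return (result, report of top hot spots)."""
    profiler = cProfile.Profile()
    result = profiler.runcall(func, *args)
    stream = io.StringIO()
    pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
    return result, stream.getvalue()
```

Pointing `profile_run` at your test suite's entry point gives you a profile of exactly the code paths the tests exercise.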

And it will be showing you how to optimize working code, not code that may or may not work when it's all put together.

You don't want code that fails optimally.

John Saunders
"You don't want code that fails optimally." - great way to put it!
Paperjam
@Paperjam: someone else said something similar. I just couldn't remember exactly what.
John Saunders
+2  A: 

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. -- Hoare

While @John Saunders nails it, applying TDD alone might not completely address your concerns. I adhere to TDD, and when you do TDD correctly, and if you can apply refactoring effectively, you typically end up with much leaner code, with the added benefit that you know it works. No arguments there.

However, I see too many developers write performance-ignorant code - avoiding premature optimization is no excuse for writing sloppy / lazy / naive code. Writing unit tests doesn't prevent this. Although someone who writes unit tests is probably a better coder, and better coders write bad code less often.

Do write tests, and do include performance testing in your suite of tests for the scenarios your stakeholders identify. e.g. retrieve 100 discounted products for a specific vendor, include stocking levels, and format as XML in under 3 seconds.
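A sketch of what such a stakeholder-driven performance test might look like, assuming a hypothetical `get_discounted_products_xml` function in the application under test:

```python
import time
import unittest

def get_discounted_products_xml(vendor_id):
    """Stand-in: the real function would query the database and render XML."""
    return "<products vendor='%s'/>" % vendor_id

class VendorReportPerformanceTest(unittest.TestCase):
    BUDGET_SECONDS = 3.0  # the stakeholder's stated limit

    def test_discounted_products_within_budget(self):
        start = time.perf_counter()
        xml = get_discounted_products_xml(42)
        elapsed = time.perf_counter() - start
        # Correctness first, then the performance expectation.
        self.assertTrue(xml.startswith("<products"))
        self.assertLess(elapsed, self.BUDGET_SECONDS)
```

Keeping the budget as an explicit constant makes the stakeholder expectation visible in the test itself, so a regression fails loudly instead of quietly getting slower.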

The fallacy that "premature optimization" is the same thing as "concern about performance" should not guide software development. -- Randall Hyde

If you leave performance concerns until too late, you may find it's too hard or too costly to change.


Robert Paulson
@Robert: I'm afraid I must disagree in part. Writing "the simplest thing that could possibly work" may very well lead to writing naive code. But by using TDD you will get excellent code coverage - enough that you'll be able to use those tests to drive the performance investigation and correction process.
John Saunders
@John Saunders I don't know what you're disagreeing with? My answer tries to encourage the OP to write tests based on stakeholder performance expectations. I've reworded for clarity.
Robert Paulson
@Robert: I might clarify by saying that your code has to pass all tests - including performance tests. However, a reasonable balance must be struck between getting the code to work and getting the code to work well. Poorly performing code might need to be redesigned to meet performance goals. But if it passes all automated tests, then the redesign can proceed as a refactoring, with much greater confidence that the code still works.
John Saunders
@John Saunders I still agree with you, and I don't believe anything I've said has contradicted that.
Robert Paulson
@Robert: just a matter of emphasis, perhaps. I would defer performance testing until the code works. I would certainly learn from experience: one may have learned certain designs or certain code patterns are expensive, in which case, don't use them. But beyond that, I would want to make sure I'm optimizing working code.
John Saunders