views: 417 · answers: 14

I seem to notice two schools of thought emerging with regard to optimization:

  1. Premature optimization is the root of all evil. You should only optimize when you've written the most readable and simplest thing possible. If after profiling you determine that the software is too slow, you should optimize.
  2. Optimizations should be done early in a project's lifecycle. Optimizations need to be planned for, but should be done reasonably.

On the face of it, they seem to be fairly opposed viewpoints. The thing is, I see the merit in both schools of thought, and I can think of times when each way of thinking has helped me write better and faster software.

Is there any way to reconcile these two ideas? Is there a middle ground? Is there a time when one idea is the best tool for the job? Or am I presenting a false dichotomy and both views can co-exist peacefully?

+1  A: 

I would say the middle ground is to keep known inefficiencies in mind while you're writing code, but not to optimize early if it would mean extra time in initial development or added complexity.

My theory is, "Write simple working code, then optimize as testing requires."

Alex Fort
+8  A: 

What I usually do is apply those optimizations that cost me nothing (or almost nothing). I also stay on the lookout for algorithms that don't scale well and are called very often. Other than that, I do not optimize until the software runs and I get a chance to fire up the profiler. Only then will I invest serious time into optimization.

Adrian Grigore
+1 - Great + Simple Rules
Jas Panesar
+21  A: 

Optimise at the design and architecture level early on. Micro-optimise at the implementation level later.

You need to be aware of the performance costs of design decisions you make which will be hard to change later. Implementation can often be tuned later on, which is why it's not worth doing it until you know that it's a problem.

Jon Skeet
I would like to add that micro-optimization should only be done after measuring performance and only on performance bottlenecks.
Mendelt
Mendelt: Absolutely :)
Jon Skeet
This is probably the best answer. But I figured I'd give it to someone with a bit less rep.
Jason Baker
@Jon Skeet, what are your thoughts on view compilation times? Please check this question here: http://stackoverflow.com/questions/3843546/how-much-time-to-compile-a-view-in-asp-net/3843928. Is this irrelevant, a micro-optimisation, or a design and architecture level optimisation?
Fabio Milheiro
@Fabio: Sounds premature to me. Have you investigated pre-compilation? I certainly wouldn't start changing the actual code design just for that.
Jon Skeet
@Jon Skeet, I am not planning to change code that's already written, but since I will soon be coding a new web application more heavily, I thought it could be useful to know. I haven't investigated pre-compilation, but I will do that. Thanks Jon! +1
Fabio Milheiro
+1  A: 

It should purely be a return-on-investment analysis. If you can put a little effort into design optimization and reap a vast return in performance, then do it. Gradually you will reach the point where the return for the amount of effort just doesn't make sense any more.

EBGreen
+4  A: 

Concentrate on writing code that does exactly what it's supposed to do, and only the required number of times. Optimizing clean, elegant code is usually simple.
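
One small illustration of "only the required number of times", as a hedged C++ sketch (the function names are invented for the example): clean code tends to make loop invariants obvious, so computing them exactly once comes naturally.

    #include <cctype>
    #include <string>
    #include <vector>

    // Lower-cases a copy of the input string.
    static std::string toLower(std::string s) {
        for (char& c : s)
            c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        return s;
    }

    // Counts case-insensitive matches. The lowercase form of `needle` is
    // computed exactly once, before the loop, rather than once per element.
    int countMatches(const std::vector<std::string>& haystack,
                     const std::string& needle) {
        const std::string key = toLower(needle);  // hoisted loop invariant
        int matches = 0;
        for (const std::string& s : haystack)
            if (toLower(s) == key) ++matches;
        return matches;
    }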

krosenvold
+2  A: 

I like this guy's formulation.

  1. Optimization by using a more sensible overall approach.
  2. Optimization by making the code less weird.
  3. Optimization by making the code more weird.

The first two are your point 2. His third is your point 1, and the one that is the root of all evil. And he's right: optimizations that make code "more weird" make it more complicated and more difficult to understand, resulting in more bugs and maintenance headaches.

sblundy
+1  A: 

There is one basic truth:

You cannot optimize what you cannot test

Therefore, as others have stated, and especially with regard to performance optimizations, you must write the code before you can test it. More so, in a large body of code, an algorithm may be the fastest one in general, but given how it inter-relates with other code, it may turn out to be either a useless optimization that costs you time, or slower than option 2, 3, ...

However, there is a body of knowledge out there that you can tap into, especially at the conceptual level, that can help you "pre-optimize" your design on a global scale.

Unfortunately, this is one of those debates that has no real closure.

alphadogg
+3  A: 

The ideal is to profile first, then optimize where necessary, but that doesn't work with design; by the time you've got anything executable to profile, changing the design would be very expensive. Therefore, the design has to pay attention to efficiency up front. Usually this means sketching out efficient algorithms up front, and keeping enough flexibility to change later. (This is often best done with good separation of function, keeping modules as independent as possible, and that's good design practice for other reasons.)
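
As a rough sketch of that kind of flexibility (the interface and class names here are invented for illustration): hide a data structure behind a small interface, so the deliberately simple first implementation can later be replaced by a faster one without disturbing the design.

    #include <cstddef>
    #include <string>
    #include <vector>

    // Callers depend only on this interface, so the implementation behind
    // it can be re-tuned later without changing the design.
    class TextIndex {
    public:
        virtual ~TextIndex() = default;
        virtual void add(const std::string& doc) = 0;
        virtual std::vector<std::size_t> find(const std::string& term) const = 0;
    };

    // Deliberately simple first implementation: a linear scan. If profiling
    // later shows lookup is a hotspot, this can be swapped for an inverted
    // index without touching any caller.
    class LinearScanIndex : public TextIndex {
    public:
        void add(const std::string& doc) override { docs_.push_back(doc); }
        std::vector<std::size_t> find(const std::string& term) const override {
            std::vector<std::size_t> hits;
            for (std::size_t i = 0; i < docs_.size(); ++i)
                if (docs_[i].find(term) != std::string::npos)
                    hits.push_back(i);
            return hits;
        }
    private:
        std::vector<std::string> docs_;
    };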

Typically, in the design phase(s), you'll have a good idea how important performance is. If you need to, you can design for performance from the start (which does not include code-level optimizations).

There's also the matter of developing efficient coding habits when choosing between two otherwise similar practices. For example, in C++ it's worth typing ++i rather than i++: it's a trivial thing, but it can be significantly more efficient sometimes (see the sketch below).
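
A minimal sketch of that habit (the function name is invented for the example): for built-in integers the compiler generates identical code either way, but for iterators and other class types the postfix form must construct and return a temporary copy, while the prefix form advances in place.

    #include <vector>

    void zeroAll(std::vector<int>& v) {
        // Prefer the prefix form for iterators: it++ would create and
        // return a temporary copy of the iterator on every step.
        for (std::vector<int>::iterator it = v.begin(); it != v.end(); ++it)
            *it = 0;
    }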

Anything more than that should wait until (a) it's clear that improving the performance will pay off, and (b) you know where the hotspots are.

David Thornley
+2  A: 

I would also add: use appropriate and efficient data structures from the start. This covers a wide range of things:

  1. Know how all the standard containers work, what they're good at and what they're bad at. e.g. a SortedDictionary keeps its keys ordered, at O(log n) cost per insertion, lookup and deletion; a LinkedList is quick to add and delete at a known position but poor at searching; and so on (see the sketch after this list).
  2. Know where your bottlenecks will be. Will it be CPU, disk, memory, graphics, IO, networking, etc.? Know how to utilise each one efficiently; each area requires different design patterns. This also depends on the application being developed: what is the core metric to concentrate on? For a UI it is responsiveness; for data processing, good disk caching.
  3. Multithreading. Whether the application needs to scale across multiple cores must be decided very early in the development life cycle; bolting threading on at a later stage is much more costly.
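
Here is the container-choice point from item 1 translated into C++ terms (the word-count task and function name are invented for the example): std::map keeps keys ordered at O(log n) per operation, which suits sorted output; if ordering didn't matter, std::unordered_map would trade the ordering away for O(1) average lookups.

    #include <map>
    #include <string>
    #include <vector>

    // Word-frequency count. std::map is chosen because the report must be
    // alphabetical; each insert/lookup costs O(log n). Swap in
    // std::unordered_map if ordering is irrelevant and speed matters more.
    std::map<std::string, int> countWords(const std::vector<std::string>& words) {
        std::map<std::string, int> freq;
        for (const std::string& w : words)
            ++freq[w];  // value-initialized to 0 on first insertion
        return freq;
    }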

Some of the answers you will know from experience, some will require research, but it should never be guesswork.

Skizz
+1  A: 

Databases in particular are not easy to refactor, and they are often the largest bottleneck in a system, due to designers who think they shouldn't care about performance at design time. This is short-sighted. There are many known database optimizations that will be faster almost all of the time. Not using them in your design and initial coding just to avoid "premature optimization" is silly. For instance, a cursor will almost never (unless you are computing running totals) perform better than a set-based query in SQL Server. Writing a cursor instead of a set-based query is not faster (once you understand set-based queries), so there is no reason to start with cursor-based code. The same goes for derived tables versus subqueries. Why write code you know will be slower 90% of the time than other code which takes the same amount of time to write?

Choosing a tool that makes it hard to performance-tune later is also a short-sighted decision, so how you intend to access the database should be part of what you consider up front.

Anyone who codes against a database, or who designs databases, should take the time to read up on performance tuning for their particular type of database. Knowing in advance how to write a sargable query, what sorts of things should have indexes from the start, and what the usual kinds of bottlenecks are will help you do the job better the first time.

HLGEM
It's not hard to refactor a database, presuming you have taken a few up-front steps (such as using primarily views and deprecating direct table access as much as possible) and have a system for refactoring in flight. See http://www.agiledata.org/essays/databaseRefactoring.html for more.
alphadogg
Sadly, too few existing databases are designed this way. I agree it is good to consider refactoring in the design, but out of the hundreds of databases I have had to query (I used to work for an audit agency and had to look at the design of many databases), I have not seen one that was designed this way.
HLGEM
+2  A: 

Adapting the quote "it's tactics if you win and it's cheating if you lose", I'd say

It's "planning for efficiency" if it worked and it's "premature optimization" if it didn't.

ephemient
+1  A: 

This is where planning comes into play. If you have a great plan and a good model for what you're writing, optimization should only have to happen afterwards. If you find yourself needing to optimize a lot of your code, you are most likely doing something incorrectly to start with.

A lot of this will also come with experience and with working on similar tasks. Generally, the only time you should have to write something that needs optimizing is when you are covering things you've never worked with before. Optimizations happen in the planning phase and in the post-project phase, IMO.

Syntax
+2  A: 

Build it as well as you can the first time without adding lots of time or effort. Then let it fight for your attention.

Jas Panesar
+1  A: 

Besides providing the basic functionality, code has three more features that software developers need to provide:

  1. Performance
  2. Maintainability
  3. Robustness

Ideal code would provide all three of these. With limited resources, the call of which portion of the code should be optimized, and for what, needs to be evaluated. Optimizing one of these at the cost of the others is dangerous and should be avoided as far as possible.

Vardhan Varma
I'm assuming not necessarily in that order though, right? :-)
Jason Baker
True. I think it should be the exact reverse order ...
Vardhan Varma