views: 399
answers: 10

As today's code grows more complex by the minute, it needs to be designed to be maintainable - meaning easy to read and easy to understand.

That being said, I can't help remembering the programs of a few years ago, such as Winamp or certain games, where you needed high performance because your 486 at 100 MHz would barely play mp3s, and that beautiful mp3 player consumed all of your CPU cycles.

Now I run Media Player (or whatever), start playing an mp3, and it eats up 25-30% of one of my four cores. Come on!! If a 486 could do it, how can playback take up so much processor power to do the same thing?

I'm a developer myself, and I've always advised: keep your code simple, don't prematurely optimize for performance. But it seems we've gone from "try to use the least CPU possible" to "as long as it doesn't take too much CPU, it's all right".

So, do you think we are killing performance by ignoring optimizations?

+33  A: 

Clean code doesn't kill performance. Bad code kills performance.

Otávio Décio
Well put. A clean algorithm will outperform a "tuned" bad algorithm. When it comes to GUI applications, some developers don't understand how data- and event-driven user interfaces are assembled, and the "brute force" alternatives result in what you're describing. That's not code clarity, it's code naivety.
280Z28
A: 

Personally, I always strive for a balance between performance and maintainability. We've long since left the days when CPU time was expensive and programmers were cheap, but as both a user and a developer, it's frustrating as hell to find the same tasks taking longer on newer editions of the same software running on newer, faster hardware. So, subjectively, yes, I think some companies have gone too far in the other direction.

Adrien
I think performance correlates with maintainability, not against.
Mike Dunlavey
For the most part, yes, but what I was getting at (poorly) was more along the lines of what you pointed out in your own answer; I've seen "super-maintainable" code that was not only slow, but, to me at least, more difficult to follow. As I said, "balance". To misuse the 80/20 rule, the majority of apps don't need the ultimate in performance, but, as object-orientation has become ubiquitous, applications have become (from the user standpoint, at least) slower and more bloated, while hardware speeds continue to improve. Where is the performance disparity coming from?
Adrien
+1  A: 

Developers shouldn't be afraid of optimizing their applications. The bloat and slowness of today's apps is appalling.

Lance Roberts
++ Not to repeat being redundantly repetitive, but you're right. http://stackoverflow.com/questions/926266/performance-optimization-strategies-of-last-resort/927773#927773
Mike Dunlavey
Developers should be afraid of optimizing. Optimizations need to be prioritized along with everything else that needs to be done, like more/better features, bug fixes, usability testing, etc. Developers shouldn't be the ones making the decision to optimize, unless they are also the product owner.
nicholaides
Optimizations after-the-fact have to be prioritized along with everything else, but optimizations should always be implemented as a developer develops as the standard course of action. We shouldn't write non-optimal code on purpose (without overwhelming reasons). Product owners shouldn't be messing around in the development details.
Lance Roberts
+2  A: 

Specifically with regard to the mp3 player, you're probably not comparing like with like. Your old 486 mp3 player did little but play the mp3; Media Player carries a whole bucketload of cruft doing fancy effects, the Aero interface and all that stuff. Not to mention it's probably phoning home and a dozen other places on the planet to let Microsoft know what you're listening to :-)

Actually, I think this is true more generally: the sort of UI experience we've come to expect today comes at a price, both in CPU and in memory. I think this is far more significant than any extra overhead from code structuring (and our compilers are a whole lot more clever than they were 10 years ago, so I doubt structure is even a factor at the machine-code level).

Cruachan
That's what I was thinking. iTunes uses up 9x - 10x as much CPU with the visualizer as it does without. And that's with filters (EQ, etc.) to enhance the sound that probably weren't being used as much in the 486 days.
Jarrod
+17  A: 

I've found quite the opposite to be true. The simplest code to read and maintain has tended, in my experience, to be the most performant overall. It is the hard-to-read gigantic balls of mud that tend to have performance bottlenecks in weird places that are almost impossible to remove or refactor, so they just get left there.

Lee
++ I couldn't agree more. http://stackoverflow.com/questions/926266/performance-optimization-strategies-of-last-resort/927773#927773
Mike Dunlavey
I prefer to call them hairballs, but +1
kenny
+1  A: 

Good-looking code can be fast code. The problem can be any of several things:

  • Higher-level languages greatly ease development time but can cost processor time. For a large number of applications, this is a great trade-off.
  • Programmers aren't as educated on algorithms as they used to be - this could be related to the high-level languages, as people just use their language's built-in sort() instead of knowing when to choose quicksort over insertion sort (see the sketch after this list).
  • Applications do a lot more now. I'm pretty sure Media Player has more features than an old version of WinAmp.
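
To illustrate the algorithms point, here is a minimal C++ sketch (the function names and the cutoff value are mine, purely illustrative): insertion sort wins on tiny inputs thanks to low constant factors, while the library sort is the right default everywhere else.

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Insertion sort: O(n^2) in general, but very fast on small or
    // nearly-sorted inputs because of its low constant factors.
    void insertion_sort(std::vector<int>& v) {
        for (std::size_t i = 1; i < v.size(); ++i) {
            int key = v[i];
            std::size_t j = i;
            while (j > 0 && v[j - 1] > key) {
                v[j] = v[j - 1];
                --j;
            }
            v[j] = key;
        }
    }

    // Hypothetical cutoff: below it, the simple quadratic sort tends to win.
    constexpr std::size_t kSmallThreshold = 32;

    void sort_ints(std::vector<int>& v) {
        if (v.size() <= kSmallThreshold)
            insertion_sort(v);              // cheap on tiny inputs
        else
            std::sort(v.begin(), v.end()); // O(n log n) on everything else
    }

The irony is that good std::sort implementations already do something like this internally, falling back to insertion sort on small partitions - which is exactly the kind of algorithmic knowledge being lamented here.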

I wouldn't say that fast code is dead. For counterexamples, look at operating system code (the O(1) scheduler in Linux comes to mind) and of course game code.

Chris Simmons
+3  A: 

If you're a fan of Winamp, you might like to read this great article about Justin Frankel's interesting times at AOL after AOL bought Winamp.

His latest product is Reaper.

Optimization makes the most sense when the platform is fixed for a long time and you can really learn it. This still happens in console games.

Having written a lot of tight assembly language for games, I can tell you it takes time. You write the same code over and over and change your data structures around, trying to get a great framerate.

There is no such pressure anymore on PC apps. The assumption is that the extra work put in will rarely pay off, and that anyone who wants speed will just buy a faster computer.

Nosredna
A: 

I know of no current case where a good compiler will not produce fast, efficient code if given clean, well-written source code.

Now, if you use some form of code generator, it depends on the "goodness" of the generator's output. Certainly in the past I have seen code generators that created tons and tons of garbage code for seemingly simple operations. I think the tool designers were suffering from "everything AND the kitchen sink" disease. Modern tools should be leaner, but it pays to check the tool before plunking down big bucks.

Again, if you write your own code, every compiler I am aware of today will take good, clean code and create well-optimized, fast executables (unless you turn off all optimization for debugging purposes, or something like that).
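
As a small illustration (my own sketch, not part of the original answer): given clean code like the std::accumulate call below, a modern optimizer will typically emit essentially the same machine code as for the hand-written loop, often vectorized.

    #include <cstddef>
    #include <numeric>
    #include <vector>

    // "Clean" version: expresses intent through a standard algorithm.
    int sum_clean(const std::vector<int>& v) {
        return std::accumulate(v.begin(), v.end(), 0);
    }

    // "Hand-tuned" version: explicit loop over raw indices.
    int sum_manual(const std::vector<int>& v) {
        int total = 0;
        for (std::size_t i = 0; i < v.size(); ++i)
            total += v[i];
        return total;
    }

Compiled with optimizations enabled (e.g. -O2), mainstream compilers generally produce near-identical code for both; with optimizations off, neither is fast - hence the caveat about debug builds.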

Cheers,

-R

Huntrods
Compilers are great, but that doesn't mean they can produce lightning-fast code on their own. It still takes work to make an app that handles media efficiently (audio DSP, video, animation). A lot of work.
Nosredna
I find good performance comes with using good data structures, writing efficient algorithms, and not doing any more *work* than is necessary, and optimizing code after identifying bottlenecks using profilers. I don't find good performance is related to how clean the code is.
Alex Black
A crappy algorithm will be slow, no matter how "gee-whiz cool" the compiler is and how perfectly maintainable the code.
Adrien
A: 

In my experience with non-academic software, the biggest performance killer is the use of many layers of abstraction, each of which exacts a modest performance penalty, but they combine like compound interest: five layers at 20% overhead apiece already multiply out to roughly a 2.5x slowdown. Each layer may be considered a "nice thing" and the "recommended way to do things", until you see the price being paid for it.

You see this especially in event-driven designs, where innocent-looking things like setting a property have a cascade effect throughout a network of classes.
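
Here's a contrived sketch of that cascade (the Property class and the three-layer wiring are mine, not taken from any particular framework): a single innocent-looking assignment fans out through a chain of listeners, each hop adding its own modest cost.

    #include <functional>
    #include <iostream>
    #include <utility>
    #include <vector>

    // Minimal observable property: every assignment notifies all listeners.
    class Property {
    public:
        void subscribe(std::function<void(int)> listener) {
            listeners_.push_back(std::move(listener));
        }
        void set(int value) {
            value_ = value;
            for (auto& l : listeners_) l(value_);  // the cascade starts here
        }
        int get() const { return value_; }
    private:
        int value_ = 0;
        std::vector<std::function<void(int)>> listeners_;
    };

    int main() {
        Property model, viewModel, view;

        // Each layer re-publishes the change to the next one: a modest
        // penalty per hop, compounding across the network of classes.
        model.subscribe([&](int v) { viewModel.set(v * 2); });
        viewModel.subscribe([&](int v) { view.set(v + 1); });
        view.subscribe([](int v) { std::cout << "repaint with " << v << "\n"; });

        model.set(21);  // one setter call; three layers of work result
    }

In a real GUI the listeners repaint, re-layout, and re-query data, which is where the cost really adds up.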

Mike Dunlavey
A: 

It's easy to confuse over-the-top-designed code with clear code. The former is often hard to maintain and can produce enigmatic bottlenecks. But the UML diagram probably looks really neat.

sharkin