views:

248

answers:

11

Just wondering:

In which fields of programming does performance still matter? I know about some of the classic fields like multimedia/games, embedded-systems and so on, but that can't be all.

Do you know about any niches where performance is still relevant?

+6  A: 

Performance still matters, and it will keep mattering for years to come.

Software is layered. If you ignore performance in a single layer, then fine, go on and live with it. It's OK!

But if the code in every layer performs badly, the software becomes unusable.

So performance matters most in the lower layers, but it still matters in the intermediate layers, and if your high-level code is to be competitive, then it matters there too.

frunsi
Are you saying that there'll come a day when performance *doesn't* matter? Performance will always matter. No matter how fast computers become, it'll always be possible to write software so efficient that they're not fast *enough*.
jalf
Jalf: yes and no. Performance will matter within the computational models we are bound to at the moment. So performance may matter for hundreds or thousands of years into the future! But there _may_ be a singularity where a computational task is no longer bound by time. You know that most answers already exist at the moment someone asks the question. This may lead to new concepts and models. We don't know.
frunsi
I'd say that's so hypothetical that including it in your answer is just misleading. You might as well argue that because we may one day have time travel, performance won't matter because we can just go back in time and start the task earlier. Also of course 'efficient' should've been 'inefficient' in my first comment. Oops.
jalf
I can write `while (true) { ; }` in both low level and high level languages :)
Seth
jalf: I did not include any hypothetical aspect in my original answer. I just did not want to use words like "always" and "forever". Please do not twist my words.
frunsi
+3  A: 

Certainly scientific computing of all kinds. The size and complexity of the problems attempted only increases every day. More degrees of freedom, multi-physics - all soak up gobs of computing power.

Even the web apps that I write still demand some decent response. Who wants an SLA greater than five seconds? I'm so ADD and twitchy these days that I leave any site that makes me wait longer than that.

That comment had it right: What apps DON'T care about performance?

duffymo
+5  A: 

In all of them, I would say.

A bad algorithm can bring down even the fastest computers. The programmer needs to know the difference between an O(n^2) and an O(n) algorithm, and be able to recognize it in code.
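As a minimal illustration of the O(n^2) vs O(n) gap the answer describes, here is a sketch in Python (the function names are mine, not from the answer). Both functions answer the same question, but the first compares every pair of elements while the second makes a single pass with a hash set:

```python
def has_duplicates_quadratic(items):
    # O(n^2): compares every pair of elements.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # O(n) expected time: one pass, remembering what we've seen in a set.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

On a list of a million elements the quadratic version performs on the order of a trillion comparisons while the linear one performs a million set lookups, which is exactly the kind of difference that "brings down even the fastest computers".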

Seth
+1  A: 

Performance always matters. Every program can become so slow that it is unusable. It's just the required level of performance that differs. In many applications, we don't really care whether an operation takes 0.01 or 0.5 seconds to execute, even though that's a 50x difference in speed. Both are fast enough. But if the same operation took two hours, we'd feel it was a problem.

Performance is relevant. But it's not an absolute. Sometimes we need every bit of performance we can squeeze out of the code. Other times we just need it to be fast enough. Sometimes "fast enough" means "must finish within 20 milliseconds", and other times it might mean "must finish today".

jalf
+2  A: 

What would you consider more important than performance? Maybe:

  • Functionality
  • User-friendliness
  • Security
  • Maintainability
  • Reliability
  • Programmer’s time

    ... among many other indispensable things?

Actually, for many applications, most of these are obviously more important than performance. Who would prefer a fast banking application that is not properly secured, or a fast but crude UI for a web application? There is little demand for a database engine that performs well but corrupts data easily.

The interesting thing is that performance can often be traded for things like the above.

It was the availability of abundant performance that allowed software engineers to "waste" CPU cycles to render nice user-friendly graphical interfaces a few decades ago. It is the same abundant performance that is making the overhead of virtualization software nearly negligible right now.

The bottom line is that for any given application, performance still defines the line between what is feasible and what remains unfeasible from the above list. While it is worth noting that Moore's Law (read "Hardware") has a big influence on the availability of abundant performance, it is still the efficiency of software that manages to push the boundaries of feasibility.

Daniel Vassallo
A: 

It certainly mattered a lot in everything I've ever done. That's not to say premature optimization doesn't still exist, but I think the rule: working, right, fast (enough), still applies.

Cade Roux
+1  A: 
  1. Web-based applications handling many users and many interactions. If the application is properly built to scale, performance can generally be increased by adding servers/processes, but then the cost of running the application is greater, since more hardware is required.

  2. Data mining. These algorithms are generally very slow and must process a great deal of data.

  3. Batch applications - e.g. processing end of day transactions for a bank. These need to be efficient, to keep down hardware costs and to ensure that all data is processed by the next business day.

  4. Scientific computing, as mentioned by an earlier poster. Some systems require large numbers of equations to be solved simultaneously.

  5. Vision systems. These can be computationally intensive, and especially in a real-time robotic system need to be fast enough so the robot can respond to the environment.

  6. Database systems. These are sold on performance and features, so the faster your database can answer queries, the better. Again, this will reduce hardware costs for the end user, as a faster system will require fewer clusters.

  7. Trading systems. These systems must analyze data very quickly, so that trading opportunities are not lost because of the lag in performance.

Larry Watanabe
A: 

Many problems can be modeled as special cases of edit distance. This includes plagiarism detection and finding syntax errors in a program. The obvious recursive algorithm for edit distance takes exponential time, which isn't practical once the strings are longer than 15 characters or so; a proper dynamic programming algorithm takes O(n*m) time for strings of lengths n and m.
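A minimal sketch of the dynamic programming approach the answer refers to (the classic Wagner–Fischer formulation of Levenshtein distance), in Python, using O(n*m) time and only two rows of space:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming: O(len(a) * len(b)) time."""
    m, n = len(a), len(b)
    # prev[j] holds the distance between a[:i-1] and b[:j]; start with row i=0.
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution (or match)
        prev = curr
    return prev[n]

print(edit_distance("kitten", "sitting"))  # -> 3
```

The naive recursion branches three ways at every mismatch, which is what makes it blow up exponentially; the table-based version computes each subproblem exactly once.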

RossFabricant
A: 

A good reason to care about performance is the fact that someone will find a use for it. Think about it - pretty much every computer since 2000 can handle basic web surfing, word processing, and even basic 3D games. So why do we keep upgrading our computers every couple of years? Because new applications keep coming out as a result of increasing performance - raytracers, HD video editing, etc.

Now, the above statement seems to apply to hardware, but you, as a programmer, are also responsible for performance. If you don't write well-performing code, people won't be able to leverage your code to create these new applications.

Mike
A: 

Performance / resource usage still matters when going really big or really small.

If you look at big "things" like a Google web application, the performance penalty is paid in the number of servers used. If the program is 10% "slower", you are going to add 10% more servers, and if you are Google that usually means a couple of thousand servers...
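The scaling arithmetic above can be sketched in a few lines of Python (the load and capacity numbers here are illustrative assumptions, not real figures from any company):

```python
import math

def servers_needed(load_rps: float, per_server_rps: float) -> int:
    """How many servers are needed to serve a given load (requests/sec)."""
    return math.ceil(load_rps / per_server_rps)

# Hypothetical numbers: 100,000 requests/sec total load,
# each server handling 50 requests/sec.
baseline = servers_needed(100_000, 50.0)   # 2000 servers
# If the code gets slower so each server only handles 45 requests/sec,
# the fleet has to grow by roughly the same percentage.
slowed = servers_needed(100_000, 45.0)     # 2223 servers
```

At that scale, even a single-digit percentage regression in per-server throughput translates directly into hundreds of extra machines.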

And going small, we have the various embedded systems (but since that example was mentioned in the question, I'll leave it at that).

Johan
A: 

Any real-time system. Think missile guidance systems or pacemakers. This doesn't mean the programs need to be amazingly fast, but being fast enough is critical. I suppose if you are Google or YouTube, any performance gain is pretty important too.

jacob