An adage coined by Niklaus Wirth in 1995:

«Software is getting slower more rapidly than hardware becomes faster»

  • Do you think it's actually true?
  • How should you measure the "speed" of software? By CPU cycles, or rather by the time you need to complete some task?
  • What about software that is actually getting faster and leaner (measured in CPU cycles and MB of RAM) and more responsive with new versions, like Firefox 3.0 compared with 2.0, Linux 2.6 compared with 2.4, or Ruby 1.9 compared with 1.8? Or completely new software that is an order of magnitude faster than the old stuff (like Google's V8 engine)? Doesn't that negate the law?
+5  A: 

Yes I think it is true.

How do I measure the speed of software? Well, the time to solve tasks is a relevant indicator. As a user of software I do not care whether there are 2 or 16 cores in my machine. I want my OS to boot fast, my programs to start fast, and I absolutely do not want to wait for simple things like opening files. Software just has to feel fast. So... watching Windows Vista boot, I am not watching fast software.

Software and frameworks often improve their performance. That's great, but these are mostly minor changes. The exception proves the rule :)

In my opinion it is all about feeling. And it feels like computers were faster years ago. Of course I couldn't run current games and software on those old machines. But they were just faster :)

eteubert
For example, Ruby 1.9 is 2.5x faster than 1.8. A minor change?
vartec
As I said: the exception proves the rule. I know there are also great efforts in optimizing JavaScript engines at the moment, but very often I don't feel the speedups. I buy a new computer, start it and think "Hmm, seems slower than the previous one". Maybe it's just a matter of the wrong OS.
eteubert
My Ubuntu box goes from power-on to login in half the time of my XP box. Admittedly the Ubuntu box has a slightly faster CPU, but it has less RAM.
Skizz
+4  A: 

In general, the law holds true. As you have stated, there are exceptions "that prove the rule". My brother recently installed Win3.1 on his 2GHz+ PC and it boots in the blink of an eye.

I guess there are many reasons why the law holds:

  1. Many programmers entering the profession now have never had to work with systems of limited speed and resources, so they never really think about the performance of their code.
  2. Getting the code written by the deadline generally takes priority; performance tuning usually comes last, after bug fixing and new features.

I find FF's lack of an immediate splash dialog annoying: it takes a while for the main window to appear after starting the application, and I'm never sure whether the click 'worked'. OpenOffice also suffers from this.

There are a few articles on the web about changing the perception of speed of software without changing the actual speed.

Skizz

EDIT:

In addition to the above points, an example of the low importance given to efficiency is this site, or rather, most of the other Q&A sites. This site has always been developed to be fast and responsive, and it shows. Compare this to the other sites out there: I've found phpBB-based sites flexible but slow. Google is another example of placing speed high in importance (it even tells you how long the search took); compare it with the other search engines that were around when Google started (now they're all fast, thanks to Google).

It takes a lot of effort, skill and experience to write fast code, which is something I've found many programmers lack.

Skizz
"Win3.1 on his 2GHz+ PC": there is no doubt about it. But AFAIR, Win3.1 on its contemporary PC took a lot longer to start. And there was no suspend-to-RAM, nor hibernation.
vartec
And nowadays, Win3.1 won't recognize half the hardware, so you have essentially an expensive brick that boots fast.
Adriano Varoli Piazza
The Win3.1 install did work - I think he wanted to play some old game that wouldn't work under XP.
Skizz
Google is a clear example where raw hardware power is cheaper and better than an "optimal" solution. See for example http://www.databasecolumn.com/2008/01/mapreduce-a-major-step-back.html
vartec
+2  A: 

My machine is getting slower and clunkier every day. I attribute most of the slowdown to running antivirus. When I want to speed up, I find that disabling the antivirus works wonders, although it leaves me about as apprehensive as being in a seedy brothel.

dar7yl
Antivirus scanning can be disk-intensive. Get an SSD, or turn off the antivirus. It's not hard to keep your personal Windows machine pristine. Antivirus is for everybody else.
guns
+2  A: 

I think that Wirth's law was largely caused by Moore's law: if your code ran slow, you'd just disregard it, since soon enough it would run fast enough anyway. Performance didn't matter.

Now that Moore's law has changed direction (more cores rather than faster CPUs), computers don't actually get much faster, so I'd expect performance to become a more important factor in software development (until a really good concurrent programming paradigm hits the mainstream, anyway). There's a limit to how slow software can be while still being useful, y'know.

gustafc
+3  A: 

One of the causes of slow software is that most developers use very high-end machines with multicore CPUs and loads of RAM as their primary workstations. As a result they don't notice performance issues easily.

Part of their daily routine should be running their code on the slower, more mainstream hardware that their expected clients will be using. This will show real-world performance and let them focus on improving bottlenecks. Even running within a VM with limited resources can aid in this review.

Faster hardware shouldn't be an excuse for writing slow, sloppy code, but it is.

schooner
+3  A: 

It is wrong. The correct version is:

Software is getting slower at the same rate as hardware becomes faster.

The reason is that this is mostly determined by human patience, which stays the same. The law also neglects the fact that today's software does more than the software of 30 years ago did, even if we ignore eye candy.

starblue
Nah, IMO human patience is most likely fading away. Have you ever had to use your favorite bloated IDE on grandma's computer (say, in a weekend emergency)? Frustrating, at the very least.
Leonel
+1  A: 

Yes, it holds true. You have given some prominent counterexamples, but bear in mind that those are developed by big communities of quite knowledgeable people who are more or less aware of good programming practices.

People working on the kernel are aware of different CPU architectures, multicore issues, cache lines, etc. There is an interesting ongoing discussion about the inclusion of hardware performance counter support in the mainline kernel. It is interesting from the 'political' point of view, as there is a conflict between the kernel people and people with a lot of experience in performance monitoring.

People developing Firefox understand that the browser should be "lightweight" and fast in order to be popular. And to some extent they manage to do a good job.

New versions of software are expected to run on faster hardware in order to deliver the same user experience. But is the price justified? How can we assess whether the functionality was added in an efficient way?

But coming back to the main subject: many people, after finishing their studies, are not aware of issues related to performance and concurrency (or, even worse, they do not care). For quite a long time Moore's law provided a steady performance boost, so people wrote mediocre code and nobody even noticed that there was something wrong with inefficient algorithms, data structures or lower-level details.

Then some limitations came into play (thermal efficiency, for example) and it is no longer possible to get 'easy' speed for a few bucks. People who just depend on hardware performance improvements may get a cold shower. On the other hand, people with in-depth knowledge of algorithms, data structures and concurrency issues (quite difficult to recruit...) will continue to write good applications, and their value on the job market will increase.

Wirth's law should not be interpreted only literally; it is also about code bloat, violation of the keep-it-simple-stupid rule, and people who waste the opportunity offered by 'faster' hardware.

Also, if you happen to work in the area of HPC, these issues become quite obvious.

Anonymous
Actually, in the area of HPC these issues do not become obvious. From an economic point of view it's a lot cheaper to add nodes to a cluster than to pay 20-50 times more to have the code run 30% faster. Skilled mathematicians and software developers don't come cheap.
vartec
Then it depends on the work culture and policies: in some places there is more focus on adding new hardware, in others on squeezing the last bit of performance out of it ;)
Anonymous
If you have qualified programmers working for 5¢ an hour...
vartec
Or if your target hardware is fixed, for example if it's already in the customer's hands and they're not going to upgrade.
Crashworks
@Crashworks -- in HPC? In HPC the customer will rather upgrade his blades to new CPUs for $50K and get an immediate performance boost of 50% than pay developers $500K for a 25% performance boost in 6 months.
vartec
+3  A: 

From my own experience, I have to disagree with Wirth's law.

When I first approached a computer (in the '80s), the time needed to display a small still picture was perceptible. Today my computer can decode and display 1080p AVCHD movies in real time.

Another indicator is the frame rate of video games. Not long ago it used to be around 15 fps. Today 30 to 60 fps is not uncommon.

mouviciel
+1. You did pick two areas where developers still shy away from fancy new languages and VMs in favor of C++ and assembly, though.
MSalters
FPS is a rather bad indicator (it can be changed in the same game on the same hardware by playing with the options menu); besides, running an early 3D game on modern hardware tends to generate frame rates well beyond usefulness (e.g. 100-200 fps).
DrHazzard
@MSalters: graphics engines, yes, but a lot of games have their logic written in Lua.
vartec
+2  A: 

Yes, software nowadays may be slower or faster, but you're not comparing like with like. Software now has so much more capability, and a lot more is expected of it.

Let's take PowerPoint as an example. If I created a slideshow with PowerPoint in the early nineties, I could fairly easily have a slideshow with pretty colours, nice text, etc. Now, it's a slideshow with moving graphics, fancy transitions and nice images.

The point is, yes, software is slower, but it does more.

The same holds true of the people who use the software. Back in the '70s, to create a presentation you had to make your own transparencies, maybe even using a pen :-). Now, if you did the same thing, you'd be laughed out of the room. It takes the same time, but the quality is higher.

This (in my opinion) is why computers don't give you gains in productivity: you spend the same amount of time doing 'the job'. But with today's software your results look more professional; you gain in quality of work.

MatthieuF
Shouldn't that be loaded when necessary, though, and not at startup? I wonder if the next big programming language (tm) will have syntax that makes this extremely easy.
Daniel W
+7  A: 

It's not that software becomes slower, it's that its complexity increases.

We now build upon many levels of abstraction.
When was the last time people on SO coded in assembly language?
Most never have and never will.

Renaud Bompuis
+2  A: 

Skizz and Dazmogan have it right.

  • On the one hand, when programmers try to make their software take as few cycles as possible, they succeed, and it is blindingly fast.

  • On the other hand, when they don't, which is most of the time, their interest in "Galloping Generality" uses up every available cycle and then some.

I do a lot of performance tuning. (My method of choice is random halting.) In nearly every case, the reason for the slowness is over-designed classes and data structures.

Oddly enough, the reason usually given for excessively event-driven designs and redundant data structures is "efficiency".

As Bompuis says, we build upon many layers of abstraction. That is exactly the problem.

Mike Dunlavey
+1  A: 

In some cases it is not true: the frame rate of games and the display and playback of multimedia content are far superior today to what they were even a few years ago.

In several aggravatingly common cases, the law holds very, very true. When opening the "My Computer" window in Vista to see your drives and devices takes 10-15 seconds, it feels like we are going backwards. I really don't want to start any controversy here, but it was that, along with the huge difference in the time needed to open Photoshop, that drove me off the Windows platform and onto the Mac. The point is that this slowdown in common tasks was serious enough to make me jump way out of my former comfort zone to get away from it.

Mark Brittingham
I wish people would stop going on about Vista; my machine has little lag opening My Computer (2-3 seconds on occasion). As for Photoshop, I have a machine where it opens at a reasonable speed (it is a large program) and a 3-year-old machine where it takes ages, but I wouldn't call opening Photoshop a common task!
DrHazzard
+2  A: 

Quoting from a UX study:

The technological advancements of 21 years have placed modern PCs in a completely different league of varied capacities. But the “User Experience” has not changed much in two decades. Due to bloated code that has to incorporate hundreds of functions that average users don’t even know exist, let alone ever utilize, the software companies have weighed down our PCs to effectively neutralize their vast speed advantages.

Detailed comparison of UX on a vintage Mac and a modern Dual Core: http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins

Lakshman Prasad