Hi

From time to time you hear stories that are meant to illustrate how good someone is at something, and sometimes you hear about the guy who is so into code optimization that he optimizes his delay loop.

This really sounds like a strange thing to do, since it's much better to set up a "timer interrupt" than an optimized busy wait, and yet nobody ever tells you the name of the optimizing hacker.

That has left me wondering: is it an urban myth, or is it real?

What do you say, reality or fiction?

Thanks Johan


Update: It sounds like ShuggyCoUk was on to something; I wonder if we can find an example.

Update: Just a little clarification: this question is about the "delay" function itself and how it is implemented, not how and where you call it. Also what its purpose was, and how the system became better as a result.
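
To make that concrete, here is a minimal sketch of the kind of calibrated busy-wait delay function I mean (hypothetical; the loops_per_usec value is made up, and a real system would calibrate it at startup):

    /* Hypothetical busy-wait delay of the kind the stories describe.
       A real implementation calibrates loops_per_usec at startup. */
    static volatile unsigned long loops_per_usec = 100; /* made-up value */

    static void delay_us(unsigned long usec)
    {
        volatile unsigned long i;
        for (i = 0; i < usec * loops_per_usec; i++)
            ; /* the volatile counter keeps the compiler from removing the loop */
    }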

Update: It's no myth; those guys seem to exist.

Thanks ShuggyCoUk

+1  A: 

The version I've always heard is of a group of hardware programmers who developed a special instruction that optimised the idle (not busy) loop of their operating system. This is mentioned in Kernighan & Pike's book The Practice Of Programming, but even there they admit it may be an Urban Myth.

anon
Extra kudos for the mention of TPOP - an excellent book.
Jonathan Leffler
A: 

I've heard stories of programmers who intentionally put in long delay loops early in projects and removed them later as "optimizations" to impress management. Never figured out if the stories were apocryphal or not.

Nosredna
But that is another kind of "trick", one meant to keep the project managers off their backs...
Johan
But that one is funny though ;)
Johan
Yeah. I wondered if the stories are related.
Nosredna
No, not related. But the version I heard about those delays was that the "team" decreased them a little bit whenever management wanted to see results (and no other result could be shown). So from time to time the managers got nice benchmarks that they could cut and paste into PowerPoint, and by doing this the team got some room to breathe for a week or two...
Johan
It must be apocryphal. I heard the story before there was such a thing as PowerPoint. :-)
Nosredna
http://thedailywtf.com/Articles/The-Speedup-Loop.aspx, maybe?
Michael Myers
Maybe it was widespread practice. :-)
Nosredna
Don't think so, but it is a good story for the coffee break ;)
Johan
Much like the episode of Star Trek: TNG with Scotty. "You didn't tell him how long it would really take?" said with shock and at least a touch of horror.
David Thornley
+3  A: 

This has more than a kernel of truth about it...

A spin wait can be much better than a signal-based interrupt or a yield.

  • You trade some throughput for much reduced latency.
    • Often this is vitally important within an OS itself.
  • You allow yourself the freedom to do operations not possible within an interrupt handler
    • memory allocation, for example.
  • You can get considerably finer-grained control of the interval waited, since you can essentially measure the cycle count (see the sketch after this list).
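
For concreteness, a minimal sketch of such a cycle-counted busy wait on x86-64 with GCC/Clang (my own illustration, assuming an invariant TSC; the interface is not taken from any particular OS):

    #include <stdint.h>
    #include <x86intrin.h> /* __rdtsc() */

    /* Spin until the given number of TSC cycles have elapsed.
       Assumes the TSC ticks at a constant rate (invariant TSC). */
    static void spin_wait_cycles(uint64_t cycles)
    {
        uint64_t start = __rdtsc();
        while (__rdtsc() - start < cycles) {
            /* burn cycles until the budget is spent */
        }
    }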

However, spin waits are tricky to get right.

  • If you can, you should use proper idle instructions (see the PAUSE sketch after this list), which:
    • can power down parts of the core, improving power usage/heat dissipation and even allowing other cores to go faster.
    • On Hyper-Threaded CPUs, you allow the other logical thread to use the full CPU pipeline while you spin.
    • without them, an instruction you might think is a no-op can still be dispatched out of order by the superscalar execution units; the resulting code may show unforeseen out-of-order artefacts that force the CPU into unwanted stalls and memory barriers.
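
As a minimal sketch of what using such an idle hint looks like (C11 atomics plus Intel's _mm_pause() intrinsic; the flag-based protocol is just an assumption for illustration):

    #include <stdatomic.h>
    #include <emmintrin.h> /* _mm_pause() */

    /* Spin on a shared flag. _mm_pause() tells the CPU this is a spin
       loop, reducing power draw and freeing pipeline resources for a
       sibling hyper-thread, as described above. */
    static void spin_until_set(atomic_int *flag)
    {
        while (atomic_load_explicit(flag, memory_order_acquire) == 0) {
            _mm_pause();
        }
    }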

This is why, in most cases, you let someone else write the spin-wait loop for you...

As for some citations of using PAUSE for spin waits:

  • PostgreSQL
  • Linux
    • See also the note that this is better on non-P4 CPUs as well, due to reduced power usage.
ShuggyCoUk