  Debug.WriteLine("Timer is high-resolution: {0}", Stopwatch.IsHighResolution);
  Debug.WriteLine("Timer frequency: {0}", Stopwatch.Frequency);

Result:

  Timer is high-resolution: True
  Timer frequency: 2597705

This article (from 2005!) mentions a Frequency of 3,579,545, about a million more than mine. This blog post mentions a Frequency of 3,325,040,000, which is insane.

Why is my Frequency so much lower by comparison? I'm on an i7 920 machine, so shouldn't it be higher?

A: 

Is it a laptop? Oftentimes the clock rate of a machine is scaled depending on its battery/power status.

Nick
It's a desktop. I have power settings set to High Performance mode; the processor clock rate is at default (2.66 GHz).
Sam Pearson
+3  A: 

The frequency depends on the HAL (Hardware Abstraction Layer). Back in the Pentium days, it was common to use the CPU tick (which was based on the CPU clock rate), so you ended up with really high-frequency timers.

With multi-processor and multi-core machines, and especially with variable-rate CPUs (the CPU clock slows down for low-power states), using the CPU tick as the timer becomes difficult and error-prone, so the writers of the HAL seem to have chosen a slower but more reliable hardware clock, like the real-time clock.

John Knoeller
A: 

The Stopwatch.Frequency value is per second, so your frequency of 2,597,705 means you have more than 2.5 million ticks per second. Exactly how much precision do you need?
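
To put the resolution in perspective, here is a small sketch (assuming only the same Stopwatch API used in the question) that converts the frequency into the length of a single tick:

  // Stopwatch.Frequency is in ticks per second, so one tick lasts
  // 1 / Frequency seconds; multiply by 1e9 to express that in nanoseconds.
  double nanosecondsPerTick = 1000000000.0 / Stopwatch.Frequency;
  Debug.WriteLine("Resolution: {0} ns per tick", nanosecondsPerTick);
  // With Frequency = 2,597,705 this prints roughly 385 ns per tick.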

As for the variations in frequency, that is a hardware-dependent thing. Some of the most common hardware differences are the number of cores, the frequency of each core, the current power state of your CPU (or cores), whether you have enabled the OS to dynamically adjust the CPU frequency, etc. Your frequency will not always be the same, and depending on what state your CPU is in when you check it, it may be lower or higher, but generally around the same (for you, probably around 2.5 million).

jrista
+4  A: 

3,579,545 is the magic number. That's the clock frequency at which the timer chip in the original IBM PC ran. The number wasn't chosen by accident: it is the frequency of the color burst signal in the TV system used in the US and Japan. The IBM engineers were looking for a cheap crystal to implement the oscillator, and nothing was cheaper than the one used in every TV set.

Once IBM clones became widely available, it was still important for their designers to choose the same frequency. A lot of MS-DOS software relied on the timer ticking at that rate. Directly addressing the chip was a common crime.

That changed once Windows came around. A version of Windows 2 was the first to virtualize the timer. In other words, the software wasn't directly talking to the timer chip anymore; an attempt to use an I/O instruction to address the timer chip was trapped, and the return value was faked by software.

The Win32 API (Windows NT 3.1 and Windows 95) formalized access to the timer with two functions, QueryPerformanceCounter() and QueryPerformanceFrequency(). A kernel-level component, the Hardware Abstraction Layer, allows the BIOS to pass that frequency up. Now it was possible for hardware designers to really drop the dependency on the exact frequency. That took a long time, by the way; around 2000 the vast majority of machines still had the legacy rate.
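
As an aside, here is a minimal C# P/Invoke sketch of those two functions (the declarations are assumptions based on the documented kernel32.dll signatures, not code from this thread); the frequency it reports is the same value Stopwatch.Frequency exposes on a high-resolution system:

  using System;
  using System.Runtime.InteropServices;

  static class QpcDemo
  {
      // Win32 performance counter functions exported by kernel32.dll.
      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceCounter(out long count);

      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceFrequency(out long frequency);

      static void Main()
      {
          long frequency, count;
          QueryPerformanceFrequency(out frequency);
          QueryPerformanceCounter(out count);

          // The HAL decides what this frequency is; on the asker's machine
          // it would report 2597705, matching Stopwatch.Frequency.
          Console.WriteLine("QPC frequency: {0} ticks/s", frequency);
          Console.WriteLine("QPC counter:   {0}", count);
      }
  }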

But the never-ending quest to cut costs in PC design put an end to that. Nowadays, the hardware designer just picks any frequency that happens to be readily available in the chipset. 3,325,040,000 would be such a number; it is most probably the CPU clock rate. High frequencies like that are common in cheap designs, especially the ones that have an AMD core. Your number is pretty unusual; maybe your machine wasn't cheap.

Hans Passant
nobugz, thank you for the detailed answer. Could you expand on your last paragraph? It seems to me that a higher frequency would grant increased precision: with a timer frequency of 3.3 GHz, I'm at 0.3 ns resolution, whereas with my frequency, I'm at 385 ns.
Sam Pearson
Well, you've got a lot less resolution. But your timer is probably a lot more accurate. The 3.3 GHz CPU clock rate is typically only accurate to about 10%. I don't know that for a fact; it depends on how the signal gets generated. Anything running at one megahertz or better is plenty good enough for timing software; the jitter due to threading is a lot worse than that.
Hans Passant