
How does a timer work at the millisecond (0.001 s) level? How can we divide the second into whatever fractions we want, and how does the computer keep track of the second itself?

A: 

The motherboard has a clock that ticks. Every tick represents a unit of time.

To be more precise, the clock is usually a quartz crystal that oscillates at a given frequency; some common CPU clock frequencies are 33.33 and 40 MHz.
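
A toy calculation of how counting those ticks turns into time, in C - the 32768 Hz figure matches a typical RTC crystal, and the counter value is made up for illustration:

#include <stdio.h>
#include <stdint.h>

/* Minimal sketch: if a counter is incremented once per oscillation of a
   crystal running at a known frequency, elapsed time is ticks / frequency.
   Both numbers below are illustrative, not read from real hardware. */
int main(void)
{
    const uint64_t frequency_hz = 32768;   /* assumed crystal frequency */
    uint64_t ticks = 98304;                /* assumed raw counter value */

    double elapsed_seconds = (double)ticks / (double)frequency_hz;
    printf("%llu ticks at %llu Hz = %.3f seconds\n",
           (unsigned long long)ticks, (unsigned long long)frequency_hz,
           elapsed_seconds);
    return 0;
}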

Esteban Araya
The real-time clock frequency is usually completely unrelated to the CPU clock frequency - most often something like 32.768 kHz.
Steve Fallows
+33  A: 

http://computer.howstuffworks.com/question319.htm

In your computer (as well as other gadgets), the battery powers a chip called the Real Time Clock (RTC) chip. The RTC is essentially a quartz watch that runs all the time, whether or not the computer has power. The battery powers this clock. When the computer boots up, part of the process is to query the RTC to get the correct time and date. A little quartz clock like this might run for five to seven years off of a small battery. Then it is time to replace the battery.
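
For the curious, here is a minimal Linux-only sketch of reading that battery-backed clock directly, using the /dev/rtc0 device and the RTC_RD_TIME ioctl; it needs permission to open the device, and the device name can differ between systems:

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/rtc.h>

/* Read the battery-backed RTC through /dev/rtc0, which is roughly what the
   kernel does at boot before handing timekeeping over to its own clock. */
int main(void)
{
    int fd = open("/dev/rtc0", O_RDONLY);
    if (fd < 0) {
        perror("open /dev/rtc0");
        return 1;
    }

    struct rtc_time rt;
    if (ioctl(fd, RTC_RD_TIME, &rt) < 0) {   /* ask the RTC chip for the time */
        perror("RTC_RD_TIME");
        close(fd);
        return 1;
    }

    printf("RTC says: %04d-%02d-%02d %02d:%02d:%02d\n",
           rt.tm_year + 1900, rt.tm_mon + 1, rt.tm_mday,
           rt.tm_hour, rt.tm_min, rt.tm_sec);
    close(fd);
    return 0;
}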

動靜能量
+1 for the bit about querying the RTC on boot.
Esteban Araya
+8  A: 

Your PC will have a hardware clock, powered by a battery so that it keeps ticking even while the computer is switched off. The PC knows how fast its clock runs, so it can determine when a second goes by.

Initially, the PC doesn't know what time it is (i.e. it just starts counting from zero), so it must be told what the current time is - this can be set in the BIOS settings and is stored in the CMOS, or can be obtained via the Internet (e.g. by synchronizing with the clocks at NIST).
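
A rough sketch of the "obtained via the Internet" part, as a bare-bones SNTP query in C - pool.ntp.org is just an example server, and a real client would use all four NTP timestamps to correct for network delay:

#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <time.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netdb.h>
#include <arpa/inet.h>

/* Send an empty SNTP client request and print the server's transmit
   timestamp. Error handling is kept to a minimum for brevity. */
int main(void)
{
    unsigned char pkt[48] = {0};
    pkt[0] = 0x1B;                       /* LI=0, version=3, mode=3 (client) */

    struct addrinfo hints = {0}, *res;
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_DGRAM;
    if (getaddrinfo("pool.ntp.org", "123", &hints, &res) != 0)
        return 1;

    int s = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    sendto(s, pkt, sizeof pkt, 0, res->ai_addr, res->ai_addrlen);
    recvfrom(s, pkt, sizeof pkt, 0, NULL, NULL);

    /* Transmit timestamp: seconds since 1900 at byte offset 40, big-endian. */
    uint32_t secs_be;
    memcpy(&secs_be, pkt + 40, 4);
    time_t unix_time = (time_t)(ntohl(secs_be) - 2208988800UL);  /* 1900 -> 1970 */

    printf("Server time: %s", ctime(&unix_time));
    close(s);
    freeaddrinfo(res);
    return 0;
}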

Jeffrey Kemp
+2  A: 

To answer the main question: the BIOS clock is backed by a battery on your motherboard, as Jian's answer says. That keeps time when the machine is off.

To answer what I think your second question is, you can get the second from the millisecond value by doing an integer division by 1000, like so:

second = (int) (milliseconds / 1000);

If you're asking how we're able to get the time with that accuracy, look at Esteban's answer... the quartz crystal vibrates with a certain period, say 0.00001 seconds. We just make a circuit that counts the vibrations. When we have reached 100000 vibrations, we declare that a second has passed and update the clock.

We can get any resolution by counting the vibrations this way - any resolution that's no finer than the period of vibration of the crystal we're using.
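
Here's a toy model of that divider circuit in C. The 100000 Hz crystal and the divisor values are illustrative, not taken from any real chip:

#include <stdio.h>
#include <stdint.h>

/* Every oscillation of the crystal increments a counter; when the counter
   reaches a chosen divisor we emit a "tick" and start over. With a
   100000 Hz crystal (the 0.00001 s period above), a divisor of 100000
   gives 1-second ticks, 100 gives millisecond ticks, and so on. */

#define CRYSTAL_HZ 100000u

static void simulate(uint32_t divisor, uint32_t oscillations)
{
    uint32_t counter = 0, ticks = 0;
    for (uint32_t i = 0; i < oscillations; i++) {
        if (++counter == divisor) {      /* divisor oscillations have passed */
            counter = 0;
            ticks++;                     /* this is the timer "event" */
        }
    }
    printf("divisor %6u -> %u ticks in one simulated second\n", divisor, ticks);
}

int main(void)
{
    simulate(CRYSTAL_HZ, CRYSTAL_HZ);        /* 1 tick per second */
    simulate(CRYSTAL_HZ / 1000, CRYSTAL_HZ); /* 1000 ticks per second (1 ms) */
    simulate(CRYSTAL_HZ / 2, CRYSTAL_HZ);    /* 2 ticks per second (0.5 s) */
    return 0;
}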

Sudhir Jonathan
First, which one is "Esteban's answer"? Second, you misunderstood me: I'm asking how to make the timer generate its event every 0.001 s. I don't ask how to use it - I ask how it's made, and how I can freely divide the second as I want.
mavric
Well, if you need portions of a second, then use the ratio 1 second = 1000 milliseconds... so if you want 1/2 second, that's 500 milliseconds; 1/5 second is 200 milliseconds; 1/10 second is 100 milliseconds; and so on...
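
If you want the operating system to deliver the event for you, here's a minimal POSIX sketch using setitimer(). The 1000-microsecond interval matches the 0.001 s you asked about; 500000 or 200000 would give the other fractions. Coarse kernel timer resolution may stretch the intervals out:

#include <stdio.h>
#include <signal.h>
#include <unistd.h>
#include <sys/time.h>

/* Ask the OS for a SIGALRM every 0.001 s and count how many arrive. */
static volatile sig_atomic_t events = 0;

static void on_tick(int sig)
{
    (void)sig;
    events++;                        /* runs once per timer interval */
}

int main(void)
{
    signal(SIGALRM, on_tick);

    struct itimerval iv;
    iv.it_interval.tv_sec = 0;
    iv.it_interval.tv_usec = 1000;   /* fire every 0.001 s */
    iv.it_value = iv.it_interval;    /* first expiry after one interval */
    setitimer(ITIMER_REAL, &iv, NULL);

    while (events < 1000)            /* about one second at the requested rate */
        pause();

    printf("received %d timer events\n", (int)events);
    return 0;
}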
Sudhir Jonathan
+3  A: 

Computers know the time because, like you, they have a digital watch they look at from time to time.

When you get a new computer or move to a new country you can set that watch, or your computer can ask the internet what the time is, which helps to stop it from running slow or fast.

As a user of the computer, you can ask the current time, or you can ask the computer to act as an alarm clock. Some computers can even turn themselves on at a particular time, to back themselves up, or wake you up with a favourite tune.

Internally, the computer is able to tell the time in milliseconds, microseconds or sometimes even nanoseconds. However, this is not entirely accurate, and two computers next to each other would have different ideas about the time in nanoseconds. But it can still be useful.
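
A small sketch of reading that high-resolution clock on a POSIX system, using clock_gettime() with CLOCK_MONOTONIC; keep in mind this is resolution, not accuracy:

#include <stdio.h>
#include <time.h>

/* Measure an interval with nanosecond resolution. Two machines will still
   disagree at this scale, but differences measured on one machine are
   meaningful. */
int main(void)
{
    struct timespec a, b;
    clock_gettime(CLOCK_MONOTONIC, &a);
    /* ... some work would go here ... */
    clock_gettime(CLOCK_MONOTONIC, &b);

    long long ns = (b.tv_sec - a.tv_sec) * 1000000000LL + (b.tv_nsec - a.tv_nsec);
    printf("elapsed: %lld ns\n", ns);
    return 0;
}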

The computer can set an alarm for a few milliseconds in the future, and commonly does this so it knows when to stop thinking about your e-mail program and spend some time thinking about your web browser. Then it sets another alarm so it knows to go back to your e-mail a few milliseconds later.

As a programmer you can use this facility too. For example, you could set a time limit on a level in a game using a 'timer', or you could use a timer to tell when you should put the next frame of the animation on the display - perhaps 25 times a second (i.e. every 40 milliseconds).
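
Here's one way such a frame timer might look in C on a POSIX system, sleeping until an absolute 40 ms deadline so small delays don't accumulate; draw_frame() is just a stand-in for the real work:

#include <stdio.h>
#include <time.h>

/* A fixed-rate 25 frames-per-second loop built on clock_nanosleep(). */
static void draw_frame(int n) { printf("frame %d\n", n); }

int main(void)
{
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int frame = 0; frame < 25; frame++) {      /* one second of frames */
        draw_frame(frame);

        next.tv_nsec += 40 * 1000000L;              /* deadline 40 ms later */
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}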

Alex Brown
I don't have a digital watch.
Nosredna
mavric does. You should be ashamed.
Alex Brown
This is a very elegant way to explain computer time-keeping to a novice. +1
Chris McCall
+5  A: 

Some recap, and some more info:

1) The computer reads the Real-Time Clock during boot-up, and uses that to set its internal clock.

2) From then on, the computer uses its CPU clock only - it does not re-read the RTC (normally).

3) The computer's internal clock is subject to drift - due to thermal instability, power fluctuations, inaccuracies in finding an exact divisor for seconds, interrupt latency, cosmic rays, and the phase of the moon.

4) The magnitude of the clock drift could be on the order of seconds per day (tens or hundreds of seconds per month).

5) Most computers are capable of connecting to a time server (over the internet) to periodically reset their clock.

6) Using a time server can increase the accuracy to within tens of milliseconds (normally). My computer updates every 15 minutes.
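
A quick back-of-the-envelope check of point 4, assuming illustrative crystal errors of 20 and 100 parts per million:

#include <stdio.h>

/* A clock that is off by N parts per million drifts N * 86400 / 1e6
   seconds per day. The 20 and 100 ppm figures are typical of cheap
   crystals but are only examples. */
int main(void)
{
    double ppm_values[] = { 20.0, 100.0 };
    for (int i = 0; i < 2; i++) {
        double per_day = ppm_values[i] * 86400.0 / 1e6;
        printf("%.0f ppm -> %.1f s/day, about %.0f s/month\n",
               ppm_values[i], per_day, per_day * 30.0);
    }
    return 0;
}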

dar7yl
A: 

Absolute time is archaically measured using a 32-bit counter of seconds since 1970. This can cause the "2038 problem," where the counter simply overflows - hence the 64-bit time APIs used on modern Windows and Unix platforms (this includes BSD-based MacOS).

Quite often a PC user is interested in time intervals rather than the absolute time since a profound event took place. A common computer implementation has things called timers that allow just that to happen. These timers might even run when the PC is powered down, with the purpose of polling hardware for wake-up status, switching sleep modes, or coming out of sleep. Intel's processor docs go into incredible detail about these.
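
A small demonstration of where that 2038 limit falls: take the largest signed 32-bit second count and print the corresponding UTC date.

#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* The largest value a signed 32-bit counter of seconds since 1970 can
   hold is INT32_MAX; the counter overflows one second after this date. */
int main(void)
{
    time_t last = (time_t)INT32_MAX;     /* 2147483647 seconds after 1970 */
    struct tm *utc = gmtime(&last);

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
    printf("32-bit time_t overflows just after %s\n", buf);
    return 0;
}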

GregC