Let's say we have an average of one page fault every 20,000,000 instructions, a normal instruction takes 2 nanoseconds, and a page fault causes the instruction to take an additional 10 milliseconds. What is the average instruction time, taking page faults into account?
A:
Out of every 20,000,000 instructions, one of them will page-fault.
Therefore, the 20,000,000 instructions will take
(2 nanoseconds * 20,000,000) + 10 milliseconds
Take that result (which is the total time for 20,000,000 instructions) and divide it by the number of instructions to get the time per instruction.
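For concreteness, here is that arithmetic as a small Python sketch (the variable names are just illustrative, not from the original answer):

    # Figures taken from the question
    instructions = 20_000_000           # instructions per page fault
    normal_time_ns = 2                  # time for a normal instruction, in ns
    fault_penalty_ns = 10 * 1_000_000   # the extra 10 ms, expressed in ns

    # Total time for one block of 20,000,000 instructions, one of which faults
    total_ns = instructions * normal_time_ns + fault_penalty_ns

    # Average time per instruction
    print(total_ns / instructions)      # -> 2.5 (nanoseconds)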
Aziz
2009-05-25 22:49:34
A:
What is the average instruction time, taking page faults into account?
The average instruction time is the total time, divided by the number of instructions.
So: what's the total time for 20,000,000 instructions?
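Working those numbers through: 20,000,000 * 2 ns is 40 ms, plus the 10 ms page fault gives 50 ms in total, and 50 ms / 20,000,000 = 2.5 ns per instruction.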
ChrisW
2009-05-25 22:50:37
A:
If 1 in 20,000,000 instructions causes a page fault then you have a page fault rate of:
Page Fault Rate = 1 / 20,000,000
You can then calculate your average time per instruction:
Average Time = (1 - Page Fault Rate) * 2 ns + Page Fault Rate * (2 ns + 10 ms)
Since 10 ms is 10,000,000 ns, the page-fault term contributes 10,000,000 ns / 20,000,000 = 0.5 ns, so this comes to 2.5 ns per instruction.
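For completeness, the same expected-value calculation as a short Python sketch (the names are illustrative, not from the answer):

    # Probability-weighted average: most instructions cost 2 ns, and one in
    # 20,000,000 also pays the 10 ms page-fault penalty
    fault_rate = 1 / 20_000_000
    normal_ns = 2
    fault_extra_ns = 10 * 1_000_000     # 10 ms in nanoseconds

    avg_ns = (1 - fault_rate) * normal_ns + fault_rate * (normal_ns + fault_extra_ns)
    print(avg_ns)                       # -> 2.5 (nanoseconds)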
Simucal
2009-05-25 23:03:56