In the article "Teach Yourself Programming in Ten Years", Peter Norvig (Director of Research at Google) gives the following approximate timings for various operations on a typical 1 GHz PC back in 2001:
- execute single instruction = 1 nanosec = (1/1,000,000,000) sec
- fetch word from L1 cache memory = 2 nanosec
- fetch word from main memory = 10 nanosec
- fetch word from consecutive disk location = 200 nanosec
- fetch word from new disk location (seek) = 8,000,000 nanosec = 8 millisec
What would the corresponding timings be for what you would consider a typical desktop PC in 2010?
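One way to reason about these numbers, before trying to update them, is to express each operation as a multiple of one executed instruction; that ratio makes it obvious how catastrophically expensive a disk seek is relative to everything else. Here is a minimal Python sketch using exactly the 2001 figures from the list above (the dictionary and function names are illustrative, not from the article):

```python
# Norvig's approximate 2001 timings, in nanoseconds (from the article).
TIMINGS_2001_NS = {
    "execute single instruction": 1,
    "fetch word from L1 cache": 2,
    "fetch word from main memory": 10,
    "fetch word from consecutive disk location": 200,
    "fetch word from new disk location (seek)": 8_000_000,
}

def relative_costs(timings):
    """Express each operation as a multiple of one instruction's cost."""
    base = timings["execute single instruction"]
    return {op: ns / base for op, ns in timings.items()}

for op, factor in relative_costs(TIMINGS_2001_NS).items():
    print(f"{op}: {factor:,.0f}x one instruction")
```

Running this shows that a seek costs as much as eight million instructions, which is the gap the exercise asks you to re-estimate for 2010 hardware: clock speeds, cache and memory latencies, and disk mechanics have all scaled at very different rates.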