The question:

Assume that cache memory is ten times faster than DRAM memory, that DRAM is 100,000 times faster than magnetic disk, and that flash memory is 1,000 times faster than disk. If it takes 2 microseconds to read a file from cache memory, how long does it take to read the same file from DRAM, disk, and flash memory?

Does this require only simple algebra, or are there any complex computations needed?

A: 

I don't know if this is really a programming question, but since no absolute access times are given, I guess you just need to work with the "X times faster than Y" ratios.

I would have thought that...

Disk Memory (1) * Flash Memory (1000) * DRAM (

Okay, I just stopped midway because the question makes no sense.

You have to:

  • first assume a value that is critical to providing the answer,
  • then, in my mind, it makes wrong comparisons (not wrong as in wrong numbers, just wrong in that there isn't a stated relationship between them),
  • finally, the end doesn't make sense. Did you type it wrong?
Laykes
Ahmet Alp Balkan
Did you make a typo in it, then? It doesn't make sense. I think you have incorrectly typed the final sentence.
Laykes
Holy shit. I think the sentence got lost because I mistakenly tapped the touchpad while typing. The final sentence is: "Find how long it takes to read the same file from DRAM, disk, and flash memory if it takes 2 microseconds to read it from cache memory."
Ahmet Alp Balkan
:) See, I knew I was right, so thanks to the person who gave me a -1 for trying to answer an incomplete question.
Laykes
+1
A: 

Given time equivalencies:

    1,000 flash = disk
  100,000 dram  = disk  
1,000,000 cache = disk  (substituted from 10 cache = dram)

Now given cache = 2µs:

disk  = 1,000,000 * 2µs =  2s
dram  = 2s / 100,000    = 20µs
flash = 2s /   1,000    =  2ms
Roger Pate
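The arithmetic above can be double-checked with a short script. This is a minimal sketch (variable names are mine, not from the answer) that derives each access time from the given cache time and the speed ratios in the question:

```python
# Speed ratios stated in the question:
CACHE_VS_DRAM = 10      # cache is 10x faster than DRAM
DRAM_VS_DISK = 100_000  # DRAM is 100,000x faster than disk
FLASH_VS_DISK = 1_000   # flash is 1,000x faster than disk

cache_us = 2.0  # given: reading from cache takes 2 microseconds

dram_us = cache_us * CACHE_VS_DRAM   # 20 µs
disk_us = dram_us * DRAM_VS_DISK     # 2,000,000 µs = 2 s
flash_us = disk_us / FLASH_VS_DISK   # 2,000 µs = 2 ms

print(f"dram  = {dram_us} µs")
print(f"disk  = {disk_us} µs")
print(f"flash = {flash_us} µs")
```

Running it reproduces the answer's numbers: 20 µs for DRAM, 2 seconds for disk, and 2 ms for flash.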