I am preparing for a units quiz and there are two kinds of conversions that have me stumped.
Type one: What is the length (in ns) of one cycle on a XXX computer? Here XXX can be some value in MHz or GHz, chosen at random. I am having trouble converting the cycle times. Example:
What is the length (in ns) of one cycle on a 50 megahertz (MHz) computer?
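Here is a rough sketch (in Python, just to check my arithmetic) of how I think the conversion should go, assuming the cycle time is the reciprocal of the clock frequency and 1 second = 10^9 ns:

```python
# Cycle time (period) is the reciprocal of the clock frequency: T = 1 / f.
frequency_hz = 50e6           # 50 MHz expressed in Hz (cycles per second)
period_s = 1 / frequency_hz   # seconds per cycle
period_ns = period_s * 1e9    # convert seconds to nanoseconds (1 s = 1e9 ns)
print(period_ns)              # prints 20.0, i.e. 20 ns per cycle
```

Is that the right way to set it up?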
The second type of conversion I have trouble with: If the average instruction on a XXX computer requires ZZ cycles, how long (in ns) does the average instruction take to execute? Like the previous case, XXX will be some value in MHz or GHz. For example:
If the average instruction on a 2.0 gigahertz (GHz) computer requires 2.0 cycles, how long (in ns) does the average instruction take to execute?
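For this type I have been multiplying the cycle time from the first conversion by the number of cycles per instruction; a sketch of that, again assuming 1 s = 10^9 ns:

```python
# Time per instruction = cycles per instruction * time per cycle.
frequency_hz = 2.0e9                  # 2.0 GHz expressed in Hz
cycles_per_instruction = 2.0
period_ns = (1 / frequency_hz) * 1e9  # 0.5 ns per cycle
instruction_ns = cycles_per_instruction * period_ns
print(instruction_ns)                 # prints 1.0, i.e. 1 ns per instruction
```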
I keep getting these conversions wrong and I don't understand where my setup goes astray. Any help would be great!