The interrupt service routine (ISR) for a device transfers 4 bytes of data from the device on each device interrupt. On each interrupt, the ISR executes 90 instructions, with each instruction taking 2 clock cycles to execute. The CPU takes 20 clock cycles to respond to an interrupt request before the ISR starts to execute instructions. Calculate the maximum data rate, in bits per second, that can be input from this device, if the CPU clock frequency is 100 MHz.
Any help on how to solve this would be appreciated.
What I'm thinking: 90 instructions × 2 cycles = 180 cycles, plus the 20-cycle response delay = 200 cycles per interrupt.
At 100 MHz the CPU runs 100 million cycles per second, so 100,000,000 / 200 = 500,000 interrupts per second. Each interrupt brings in 4 bytes, so that's 2,000,000 bytes per second, or 16,000,000 bits per second.
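Here's a quick Python sanity check of that arithmetic (just the numbers from the problem statement plugged in, nothing extra assumed):

```python
# Numbers taken from the problem statement
CLOCK_HZ = 100_000_000                 # 100 MHz CPU clock
CYCLES_PER_INTERRUPT = 90 * 2 + 20     # 180 cycles of ISR work + 20-cycle response = 200
BYTES_PER_INTERRUPT = 4

interrupts_per_second = CLOCK_HZ / CYCLES_PER_INTERRUPT      # 500,000
bytes_per_second = interrupts_per_second * BYTES_PER_INTERRUPT  # 2,000,000
bits_per_second = bytes_per_second * 8                          # 16,000,000

print(f"{interrupts_per_second:,.0f} interrupts/s")
print(f"{bytes_per_second:,.0f} B/s = {bits_per_second:,.0f} b/s")
```

which prints 500,000 interrupts/s and 2,000,000 B/s = 16,000,000 b/s, matching the hand calculation.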
I think it's right, but I'm not 100% sure. Can anyone confirm? Cheers.