I'm currently working through an assignment that deals with Big-O and running times. I have this one question presented to me that seems to be very easy, but I'm not sure if I'm doing it correctly. The rest of the problems have been quite difficult, and I feel like I'm overlooking something here.
First, here are the givens:

- Algorithm A, which has a running time of 50n^3.
- Computer A, which takes 1 millisecond per operation.
- Computer B, which takes 2 milliseconds per operation.
- An instance of size 300.
I want to find how long it takes Algorithm A to solve this instance on Computer A, and how long it takes on Computer B.
What I want to do is substitute 300 for n, which gives 50*(300^3) = 1,350,000,000.
Then I would multiply that by 1 millisecond per operation for Computer A, and by 2 milliseconds per operation for Computer B.
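Here is a small sketch of that calculation in Python, just to show what I mean (it assumes 50n^3 really counts operations, which is exactly the part I'm not sure about):

```python
# A minimal sketch of my interpretation (assumption: 50n^3 is an operation
# count, not a time -- that is the part I'm unsure about).
n = 300
operations = 50 * n**3                  # 50 * 300^3 = 1,350,000,000 operations

ms_per_op_a = 1                         # Computer A: 1 millisecond per operation
ms_per_op_b = 2                         # Computer B: 2 milliseconds per operation

time_a_ms = operations * ms_per_op_a    # 1,350,000,000 ms
time_b_ms = operations * ms_per_op_b    # 2,700,000,000 ms

print(f"Computer A: {time_a_ms:,} ms = {time_a_ms / 3_600_000:.0f} hours")
print(f"Computer B: {time_b_ms:,} ms = {time_b_ms / 3_600_000:.0f} hours")
```

That would come out to 375 hours on Computer A and 750 hours on Computer B, which at least looks like a plausible homework answer.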
This approach feels odd to me, though, because the problem says the "running time" is 50n^3, not that "the number of operations is 50n^3". It makes me feel like I'm multiplying time by time, so I would end up with units of milliseconds squared, which doesn't seem right at all.
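To spell the unit concern out, here are the two readings side by side (the second is the one I suspect is intended, but that's my assumption):

```latex
(50 n^3 \ \mathrm{ms})  \times (1 \ \mathrm{ms/op}) = 50 n^3 \ \mathrm{ms^2/op}  % reading 1: literal time, units don't work
(50 n^3 \ \mathrm{ops}) \times (1 \ \mathrm{ms/op}) = 50 n^3 \ \mathrm{ms}       % reading 2: operation count, units work
```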
I would like to know if I'm right, and if not, what the question actually means.