views: 945

answers: 4

I'm currently working through an assignment that deals with Big-O and running times. I have this one question presented to me that seems to be very easy, but I'm not sure if I'm doing it correctly. The rest of the problems have been quite difficult, and I feel like I'm overlooking something here.

First, you have these things: Algorithm A, which has a running time of 50n^3. Computer A, which has a speed of 1 millisecond per operation. Computer B, which has a speed of 2 milliseconds per operation. An instance of size 300.

I want to find how long it takes algorithm A to solve this instance on computer A, and how long it takes it on computer B.

What I want to do is sub 300 in for n, so you have 50*(300^3) = 1,350,000,000.

Then, multiply that by 1 for the first computer, and by 2 for the second computer.
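
In code, the calculation would look something like this (a throwaway sketch, assuming 50n^3 really is an operation count, which is the part I'm unsure about):

    n = 300
    operations = 50 * n**3      # 50 * 300^3 = 1,350,000,000 operations

    time_a_ms = operations * 1  # Computer A: 1 ms per operation
    time_b_ms = operations * 2  # Computer B: 2 ms per operation

    print(time_a_ms)            # 1350000000 (ms)
    print(time_b_ms)            # 2700000000 (ms)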

This feels odd to me, though, because it says the "running time" is 50n^3, not "the number of operations is 50n^3", so I get the feeling that I'm multiplying time by time and would end up with units of milliseconds squared, which doesn't seem right at all.

I would like to know if I'm right, and if not, what the question actually means.

+2  A: 

It wouldn't make sense if you had O(n^3), but you are not using big O notation in your example. That is, if you had O(n^3) you would know the complexity of your algorithm, but you would not know the exact number of operations, as you said.

Instead, it looks as though you are given the exact number of operations taken (even though it is not explicitly stated). So substituting for n makes sense.

Big O notation describes how the size of the input affects your running time or memory usage. But with big O alone you could not deduce an exact running time, even given the speed of each operation.
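
For example, here is a hypothetical pair of operation-count functions (both names are made up for illustration); they are both O(n^3), yet they give very different exact counts, which is why big O alone can't produce an exact running time:

    # Both counts below are O(n^3), but the exact numbers differ,
    # so only an exact count lets you compute an exact running time.
    def ops_exact(n):
        return 50 * n**3            # the count given in the question

    def ops_other(n):
        return 5 * n**3 + 1000 * n  # also O(n^3), very different count

    print(ops_exact(300))   # 1350000000
    print(ops_other(300))   # 135300000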

Including an explanation of why your answer looks so simple (as described above) would also be a safe move. But I'm sure you'll get the marks for the question even without it.

Brian R. Bondy
+1  A: 

Well, aside from the pointlessness of figuring out how long something will take this way on most modern computers (though it might have some meaning in an embedded system), the way you did it does look right to me.

If the algorithm needs 50n^3 operations to complete something, where n is the number of elements to process, then substituting 300 for n will give you the number of operations to perform, not a time-unit.

So multiply by the time per operation and you get the time needed.

Looks right to me.

Lasse V. Karlsen
A: 

Your 50*n^3 figure is called a "running time", but that's because the model used for speed evaluations assumes a machine with a set of basic operations, each of which takes 1 time unit.

In your case, running the algorithm takes 50*300^3 time units. On computer A each time unit is 1 ms, and on computer B it is 2 ms.
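
As a sketch of that unit bookkeeping (using Python's timedelta just to make the milliseconds readable):

    from datetime import timedelta

    time_units = 50 * 300**3        # dimensionless time units

    # Each computer assigns a real duration to one time unit.
    for name, ms_per_unit in (("Computer A", 1), ("Computer B", 2)):
        print(name, timedelta(milliseconds=time_units * ms_per_unit))

    # Computer A 15 days, 15:00:00
    # Computer B 31 days, 6:00:00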

Hope this puts some sense into the units,
Asaf.

Asaf R
A: 

Whichever way you plug the numbers in, it will take on the order of n^3 time.

pradeep r