mystery(int A[1..n], int n) {
    // pre: n is a power of 2
    for i = 1..n {
        A[i] = A[i] + 1;
    }
    if (n > 1) mystery(A, n/2);
}

I think in the worst case it runs in O(n). Am I right?


Edit: This is from another old exam (which has answers for us), but the following algorithm runs in O(n log n) time according to the answer. Why? I thought these two should only differ by constant factors.

void silly (int n)
    if (n>1)
        for (int i=0; i<n; i++)
            output "looping for fun"
        silly(n/2)
        for (int i=0; i<n; i++)
            output "looping for more fun"
        silly(n/2)
        for (int i=0; i<n; i++)
            output "looping for even more fun"
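For anyone who wants to experiment, here is a runnable Python sketch of silly() that counts the output lines instead of printing them (the counting is my addition, not part of the exam code). The count comes out to 3·n·log2(n), which matches the O(n log n) answer:

```python
# Runnable sketch of the exam's silly(): count output lines rather
# than printing them (counter added for illustration).
def silly(n):
    lines = 0
    if n > 1:
        lines += n              # "looping for fun"
        lines += silly(n // 2)
        lines += n              # "looping for more fun"
        lines += silly(n // 2)
        lines += n              # "looping for even more fun"
    return lines

for n in [2, 4, 8, 16, 32]:
    print(n, silly(n))          # 6, 24, 72, 192, 480
```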
+2  A: 

Yes, it's O(n). You can sanity check it by examining the values:

A(1) = 1 iteration
A(2) = 2 + A(1) = 3
A(4) = 4 + A(2) = 7
A(8) = 8 + A(4) = 15
A(16) = 16 + A(8) = 31
A(32) = 32 + A(16) = 63
...

You can see it scales linearly: A(n) = 2n - 1, i.e. A(n) is a constant multiple of n.
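A short script to reproduce the table (the function name is mine, not from the exam):

```python
# Count the A[i] = A[i] + 1 assignments performed by mystery():
# n for the loop, plus whatever the recursive call on n/2 does.
def mystery_count(n):
    count = n                       # the for-loop runs n times
    if n > 1:
        count += mystery_count(n // 2)
    return count

for n in [1, 2, 4, 8, 16, 32]:
    print(n, mystery_count(n))      # 1, 3, 7, 15, 31, 63
```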

To answer the comment: no, it isn't O(2n) or O(2n-1). There is no O(2n); it's all O(n). See Plain English explanation of Big O.

Edit: your example has a key difference: it calls itself twice, not once. Sanity check the results again. Also, this version has a misleading feature: the loop is repeated three times, but three is a constant here, and as previously stated constant factors drop out, so I'm only going to count one of those loops:

A(1) = 1
A(2) = 2 + 2 * A(1) = 4
A(4) = 4 + 2 * A(2) = 12
A(8) = 8 + 2 * A(4) = 32
A(16) = 16 + 2 * A(8) = 80
A(32) = 32 + 2 * A(16) = 192
...

So what is the relationship? Well, if you solve A(n) (for n being a power of 2):

A(n) = n + 2 * A(n/2)
     = n + 2 * (n/2 + 2 * A(n/4))
     = 2n + 4 * A(n/4)
     = 2n + 4 * (n/4 + 2 * A(n/8))
     = 3n + 8 * A(n/8)

You can solve this to the general case:

A(n) = log2(n) * n + n * A(n/n)    (after log2(n) expansions, the coefficient 2^k reaches n)
     = log2(n) * n + n (since A(1) = 1)

And that's where O(n log n) comes from.
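Carried to the end, the expansion gives A(n) = n·log2(n) + n, since the leftover A(1) terms multiply up to a coefficient of n. A quick check of that closed form against the recurrence (names are mine):

```python
import math

# Recurrence from the answer above: A(n) = n + 2*A(n/2), A(1) = 1
# (counting one of the three loops). Compare against n*log2(n) + n.
def A(n):
    return 1 if n == 1 else n + 2 * A(n // 2)

for n in [1, 2, 4, 8, 16, 32]:
    print(n, A(n), int(n * math.log2(n) + n))
```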

cletus
Is this not O(2n-1)?
astander
@cletus, actually it's all of those: O(n) = O(2n) = O(2n-1) = O(0.5n). The convention is to drop the constant factor, since Big-O ignores it.
notnoop
A: 

If you look at the algorithm, the loop in the successive recursive calls runs

n, n/2, n/4, ..., 2, 1

times, which sums to 2n - 1.

astander
+1  A: 

Yup. The number of A[i] = A[i] + 1 assignments in a call to mystery(..., N) will be:

N + N/2 + N/4 + ... + 1

Assuming that N is a power of 2, that series evaluates to 2 * N - 1. There will be an equivalent number of increments of i, plus roughly log2(N) tests of N > 1, recursive calls, and divisions.

Roughly speaking that is 4 * N + 3 * log2(N) operations (assuming that they have equal weight ... though it doesn't matter).

For large N, the total operation count is bounded between C1 * N and C2 * N for some constants C1 and C2. In other words, the computational complexity is O(N).
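A quick empirical check of the series identity above (helper name is mine):

```python
# Sum N + N/2 + N/4 + ... + 1 for N a power of 2 and compare
# against the closed form 2*N - 1.
def series_sum(N):
    total = 0
    while N >= 1:
        total += N
        N //= 2
    return total

for N in [1, 4, 64, 1024]:
    print(N, series_sum(N), 2 * N - 1)
```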

Stephen C
+1  A: 

I'm grading a midterm on this subject right now!

Anyhow, let's call the runtime of this algorithm T(n). The for-loop takes n time, and the function calls itself with the value n/2, so T(n) = T(n/2) + n.

We can solve this recurrence using the Master Theorem, and we find that the algorithm runs in Theta(n) time.
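Spelling that out (this is the standard CLRS-style statement of the theorem, not part of the original answer): with T(n) = T(n/2) + n we have a = 1, b = 2, f(n) = n, and case 3 applies:

```latex
T(n) = 1 \cdot T(n/2) + n
\qquad a = 1,\; b = 2,\; f(n) = n

n^{\log_b a} = n^{\log_2 1} = n^0 = 1

f(n) = n = \Omega\bigl(n^{0+\varepsilon}\bigr) \text{ for, e.g., } \varepsilon = 1

a \, f(n/b) = \tfrac{n}{2} = \tfrac{1}{2} f(n) \le c \, f(n)
\text{ with } c = \tfrac{1}{2} < 1 \text{ (regularity condition)}

\Rightarrow\; T(n) = \Theta(f(n)) = \Theta(n)
```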

Ken Bloom
A: 

It's O(n).

Derivation: you can use the Master Theorem, or the simpler route of expansion.

Assume that the runtime of mystery() is T(n). Then:

T(n) = n + T(n/2)   # n for the loop, T(n/2) for the recursive call
     = n + (1/2)n + T(n/4)
     = n + (1/2)n + (1/4)n + (1/8)n + ...
     = n * (1 + 1/2 + 1/4 + 1/8 + ...)
     = n * \sum_{i=0}^{\infty} (1/2)^i
     = n * 2
     = 2n = O(n)
notnoop