I understand Big-O notation, but I don't know how to calculate it for many functions. In particular, I've been trying to figure out the computational complexity of the naive recursive version of the Fibonacci sequence:

int Fib(int n)
{
    if (n <= 1)
        return 1;
    else
        return Fib(n - 1) + Fib(n - 2);
}

What is the computational complexity of this naive Fibonacci function, and how is it calculated?

+1  A: 

Good answers here.

Greg
+10  A: 

Just ask yourself how many statements need to execute for F(n) to complete.

For F(0) and F(1), the answer is 1 (the first branch of the conditional).

For F(n), the answer is F(n-1) + F(n-2).

So what function satisfies these rules? Try a^n:

a^n == a^(n-1) + a^(n-2)

Divide through by a^(n-2):

a^2 == a + 1

Solve for a and you get (1+sqrt(5))/2, otherwise known as the golden ratio.

So it takes exponential time.
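
As a rough sanity check, here is a minimal C sketch (using the question's convention Fib(0) = Fib(1) = 1; the names CountedFib and calls are just for illustration) that counts how many calls Fib(n) makes. The ratio of consecutive call counts approaches (1+sqrt(5))/2, about 1.618, matching the a^2 == a + 1 analysis above:

#include <stdio.h>

static unsigned long long calls;              // number of invocations so far

static unsigned long long CountedFib(int n)   // same logic as the question's Fib
{
    calls++;
    if (n <= 1)
        return 1;
    return CountedFib(n - 1) + CountedFib(n - 2);
}

int main(void)
{
    unsigned long long prev = 1;
    for (int n = 1; n <= 30; n++)
    {
        calls = 0;
        CountedFib(n);
        // ratio of call counts for n versus n-1 approaches ~1.618
        printf("n=%2d calls=%llu ratio=%.4f\n", n, calls, (double)calls / (double)prev);
        prev = calls;
    }
    return 0;
}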

Jason Cohen
Proof by induction. Nice. +1
Andrew Rollings
+29  A: 

You model the time function for calculating Fib(n) as the sum of the time to calculate Fib(n-1), the time to calculate Fib(n-2), and the time to add them together (O(1)):

T(n<=1) = O(1)

T(n) = T(n-1) + T(n-2) + O(1)

You solve this recurrence relation (using generating functions, for instance) and you'll end up with the answer.

Alternatively, you can draw the recursion tree, which will have depth n, and intuitively figure out that this function is asymptotically O(2^n). You can then prove your conjecture by induction.

Base: n = 1 is obvious

Assume T(n-1) = O(2^(n-1)) and T(n-2) = O(2^(n-2)), therefore

T(n) = T(n-1) + T(n-2) + O(1) which is equal to

T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)

However, as noted in a comment, this is not the tight bound. An interesting fact about this function is that T(n) is asymptotically the same as the value of Fib(n), since both are defined as

f(n) = f(n-1) + f(n-2).

The leaves of the recursion tree will always return 1. The value of Fib(n) is the sum of all the values returned by the leaves in the recursion tree, which is equal to the count of leaves. Since each leaf takes O(1) to compute, T(n) is equal to Fib(n) x O(1). Consequently, the tight bound for this function is the Fibonacci sequence itself (~Θ(1.6^n)). You can derive this tight bound using generating functions, as mentioned above.
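
To make that concrete, under the simplest cost model (each base case and each addition costs exactly one unit; this is my simplification of the O(1) terms above) and with this code's convention Fib(0) = Fib(1) = 1, the recurrence even has an exact solution, T(n) = 2*Fib(n) - 1, which you can verify by substitution:

T(0) = T(1) = 1,    T(n) = T(n-1) + T(n-2) + 1

T(n) = (2*Fib(n-1) - 1) + (2*Fib(n-2) - 1) + 1 = 2*Fib(n) - 1

So T(n) = Θ(Fib(n)), exactly as claimed.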

By the way, if you run it on your computer, you won't feel the exponential time, as it'll throw a nice StackOverflowException pretty soon! (This is primarily meant as a joke: the depth of the recursion tree is at most n, which is probably not large enough to trigger a StackOverflowException. However, if you call the function with a large enough n, you'll end up with a StackOverflowException before you feel the exponential time. Try it with fib(int.MaxValue) and you'll see.)

Mehrdad Afshari
Also Proof by induction. Nice. +1
Andrew Rollings
Although the bound is not tight.
Captain Segfault
@Captain Segfault: Yeah. I clarified the answer. You'd get the tight bound using GF method as I had written above.
Mehrdad Afshari
I take your StackOverflowException as a joke. The exponential time is quite easily perceivable with rather small values of n.
David Rodríguez - dribeas
+2  A: 

It is bounded on the lower end by 2^(n/2) and on the upper end by 2^n (as noted in other comments): each call decreases n by at most 2, so the recursion tree is a full binary tree down to depth about n/2, and each call decreases n by at least 1, so the tree's depth is at most n. And an interesting fact of that recursive implementation is that it has a tight asymptotic bound of Fib(n) itself. These facts can be summarized:

T(n) = Ω(2^(n/2))  (lower bound)
T(n) = O(2^n)   (upper bound)
T(n) = Θ(Fib(n)) (tight bound)

The tight bound can be reduced further using its closed form if you like.

Dave L.
+6  A: 

There's a very nice discussion of this specific problem over at MIT. On page 5, they make the point that, if you assume that an addition takes one computational unit, the time required to compute Fib(N) is very closely related to the result of Fib(N).

As a result, you can skip directly to the very close approximation of the Fibonacci series:

Fib(N) = (1/sqrt(5)) * 1.618^(N+1) (approximately)

and say, therefore, that the worst case performance of the naive algorithm is

O((1/sqrt(5)) * 1.618^(N+1)) = O(1.618^(N+1))

PS: There is a discussion of the closed form expression of the Nth Fibonacci number over at Wikipedia if you'd like more information.
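
For a quick numerical check (my own sketch, not from the linked notes, using the question's convention Fib(0) = Fib(1) = 1; FibClosedForm is a made-up name), you can compare the closed-form approximation against the recursive version (compile with -lm):

#include <math.h>
#include <stdio.h>

// the naive recursive version from the question
static long long Fib(int n)
{
    if (n <= 1)
        return 1;
    return Fib(n - 1) + Fib(n - 2);
}

// closed-form approximation: Fib(N) ~= (1/sqrt(5)) * phi^(N+1), phi = (1+sqrt(5))/2
static long long FibClosedForm(int n)
{
    const double phi = (1.0 + sqrt(5.0)) / 2.0;
    return llround(pow(phi, n + 1) / sqrt(5.0));
}

int main(void)
{
    for (int n = 0; n <= 30; n++)
        printf("n=%2d recursive=%lld closed-form=%lld\n", n, Fib(n), FibClosedForm(n));
    return 0;
}

Both columns agree for small n, which is the point made in the notes: the work done by the naive version grows like the value it computes.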

Bob Cross
Nice reference to the MIT stuff.
Charlie Martin
+2  A: 

This performs way better:

unsigned int Fib(unsigned int n)
{
    // first Fibonacci number is Fib(0)
    // second one is Fib(1) and so on

    // unsigned int m;  // m + current_n = original_n
    unsigned int a = 1; // Fib(m)
    unsigned int b = 0; // Fib(m-1)
    unsigned int c = 0; // Fib(m-2)

    while (n--)
    {
        c = b;
        b = a;
        a = b+c;
    }

    return a;
}
Eduardo León
That doesn't explain how to calculate big O notation for the function.
Matt Ellen
@Matt Ellen: It's simple. This function is `O(n)`, since the number of iterations of the `while` loop is equal to `n`, and there is no other loop or recursive call. (Proving `6*n + 4` is `O(n)` is trivial if you know the definition of the `O` notation.)
Eduardo León
Determining Big-O is entirely academic, unless you have an alternative implementation (and Big-O) to compare it to. Eduardo supplies the counterpoint here.
Jason
@Eduardo: I agree that this can be a much better way of implementing it, but the notion of Big-O comments on the algorithm used for a problem. For instance, the expected runtime of bogosort (http://en.wikipedia.org/wiki/Bogosort) is `O(n*n!)`, but there are other ways to sort a list more efficiently (for instance selection sort). I think Juliet is asking about the order of the algorithm she provided, which she calls the "naive version."
starghost
@starghost: Agreed. Since her algorithm just adds 1s, the order of her algorithm is `O(Fib(n))` itself.
Eduardo León
A: 

http://www.ics.uci.edu/~eppstein/161/960109.html

time(n) = 3F(n) - 2

A: 

I've implemented a faster Fibonacci number calculator which computes the result in O(log2(n)) operations:

http://www.optionsbender.com/technologybending/algorithms/sublinearalgorithmforfibonacciseries
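
I can't tell which method the linked page uses, but one standard way to get down to O(log n) arithmetic operations is fast doubling, based on the identities F(2k) = F(k)*(2*F(k+1) - F(k)) and F(2k+1) = F(k)^2 + F(k+1)^2. Here is a hedged C sketch, shifted to this thread's convention Fib(0) = Fib(1) = 1, with made-up names (FibPair, FastFib):

#include <stdio.h>

// Returns F(n) and F(n+1) for the standard sequence F(0) = 0, F(1) = 1.
static void FibPair(unsigned int n, unsigned long long *fn, unsigned long long *fn1)
{
    if (n == 0)
    {
        *fn = 0;
        *fn1 = 1;
        return;
    }

    unsigned long long a, b;                  // a = F(k), b = F(k+1), k = n/2
    FibPair(n / 2, &a, &b);

    unsigned long long c = a * (2 * b - a);   // F(2k)
    unsigned long long d = a * a + b * b;     // F(2k+1)

    if (n % 2 == 0)
    {
        *fn = c;
        *fn1 = d;
    }
    else
    {
        *fn = d;
        *fn1 = c + d;
    }
}

// Matches the question's convention Fib(0) = Fib(1) = 1, i.e. Fib(n) = F(n+1).
unsigned long long FastFib(unsigned int n)
{
    unsigned long long fn, fn1;
    FibPair(n + 1, &fn, &fn1);
    return fn;
}

int main(void)
{
    for (unsigned int n = 0; n <= 10; n++)
        printf("FastFib(%u) = %llu\n", n, FastFib(n));
    return 0;
}

The recursion halves n at each level, so there are about log2(n) levels with a constant number of multiplications each; this is a sketch of one possible O(log n) approach, not necessarily the one behind the link.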

reddy