What's the complexity of a recursive program to find factorial of a number n? My hunch is that it might be O(n)
Yes, the standard recursive implementation of the factorial function is O(n). It's often used as an example to illustrate recursion, along with the Fibonacci function.
That said, you can also do some crazy stuff with recursion and make it perform a lot better or a lot worse.
If you take multiplication as O(1), then yes, O(N) is correct. However, note that multiplying two numbers of arbitrary length x is not O(1) on finite hardware -- as x tends to infinity, the time needed for multiplication grows (e.g. if you use Karatsuba multiplication, it's O(x ** 1.585)). (You can theoretically do better for sufficiently huge numbers with Schönhage-Strassen, but I confess I have no real-world experience with that one;-). x, the "length" or "number of digits" (in whatever base -- it doesn't matter for big-O anyway;-) of N, grows as O(log N), of course.
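A quick way to see why the multiplications can't all stay O(1): the intermediate products themselves keep getting longer. A minimal sketch (standard library only) showing that the number of bits in N! grows faster than linearly in N:

```python
import math

# The bit length of n! grows as Theta(n log n), so the operands of the
# later multiplications in the recursion get longer and longer -- on
# real hardware those multiplications cannot all be constant time.
for n in (10, 100, 1000):
    bits = math.factorial(n).bit_length()
    print(f"{n}! has {bits} bits ({bits / n:.1f} bits per multiplication step)")
```

The bits-per-step ratio keeps climbing as n grows, which is exactly the superlinear growth that makes the "multiplication is O(1)" assumption break down for arbitrary N.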
If you mean to limit your question to factorials of numbers short enough to be multiplied in O(1), then there's no way N can "tend to infinity", and therefore big-O notation is inappropriate.
Assuming you're talking about the most naive factorial algorithm ever:
    def factorial(n):
        if n == 0:
            return 1
        return n * factorial(n - 1)
Yes, the algorithm is linear, running in O(n) time. This is because it executes once every time it decrements the value n, and it decrements n until it reaches 0, meaning the function is called recursively n times. This assumes, of course, that both decrementing and multiplication are constant-time operations.
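You can verify the call count directly; as a sketch (the counter is mine), instrumenting the naive recursion shows the body runs n + 1 times in total (n recursive calls plus the initial one):

```python
calls = 0

def factorial(n):
    global calls
    calls += 1  # count every invocation, including the initial one
    if n == 0:
        return 1
    return n * factorial(n - 1)

result = factorial(10)
print(result, calls)  # 3628800 computed in 11 invocations for n = 10
```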
Of course, if you implement factorial some other way (for example, by using recursive addition instead of multiplication), you can end up with an algorithm of much worse time complexity. I wouldn't advise using such an algorithm, though.
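For the record, here is one sketch of the addition-based variant alluded to above (the function names are mine): multiplication is simulated by repeated addition, so computing n! takes Theta(n^2) additions instead of n multiplications -- and those additions operate on increasingly long numbers on top of that.

```python
def add_mult(a, b):
    # simulate a * b with b repeated additions of a
    total = 0
    for _ in range(b):
        total += a
    return total

def slow_factorial(n):
    if n == 0:
        return 1
    # keep the small operand (n) as the repetition count; repeating
    # the other way around would be astronomically worse
    return add_mult(slow_factorial(n - 1), n)

print(slow_factorial(5))  # 120 -- same answer, far more work
```

Total work is 1 + 2 + ... + n = Theta(n^2) additions, which is why you wouldn't actually want to use it.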