views: 1568
answers: 6

This is a homework question. I'm not expecting an answer, just some guidance, possibly :) I am to show that log(n!) = Θ(n·log(n)).

A hint was given that I should show the upper bound with n^n and show the lower bound with (n/2)^(n/2). This does not seem all that intuitive to me. Why would that be the case? I can definitely see how to convert n^n to n·log(n) [log both sides of an equation], but that's kind of working backwards. What would be the correct approach to tackle this problem? Should I draw the recursion tree? There is nothing recursive about this, so that doesn't seem like a likely approach.

+19  A: 

Remember that log(n!) = log(1) + log(2) + ... + log(n-1) + log(n)

Mick Sharpe
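This identity is easy to spot-check numerically; a minimal Python sketch (the value n = 20 is an arbitrary choice):

```python
import math

# Spot-check: log(n!) = log(1) + log(2) + ... + log(n)
n = 20
lhs = math.log(math.factorial(n))
rhs = sum(math.log(i) for i in range(1, n + 1))
assert math.isclose(lhs, rhs)
```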
Interesting... Pushing this into a sum, there is an approximation that results in the theta notation itself (source: http://en.wikipedia.org/wiki/Summation); the matter of how that is derived is entirely separate. Nice lead, thanks!
Mark
This is a very nice proof for the upper bound: log(n!) = log(1) + ... + log(n) <= n log(n) => log(n!) = O(n log n). However, for proving the lower bound (and consequently big-theta), you'll probably need Stirling's Approximation.
Mehrdad Afshari
You don't need Stirling's approximation for a lower bound. log(n!) = log(1) + ... + log(n) >= log(n/2) + ... + log(n) >= n/2 * log(n/2) = Omega(n log n).
Keith Randall
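This chain of inequalities can be checked numerically for a concrete even n (natural logs; a quick sketch, with n = 100 chosen arbitrarily):

```python
import math

# log(n!) >= log(n/2) + ... + log(n) >= (n/2) * log(n/2), for even n
n = 100
log_fact = sum(math.log(i) for i in range(1, n + 1))
top_half = sum(math.log(i) for i in range(n // 2, n + 1))  # the larger half of the terms
assert log_fact >= top_half >= (n / 2) * math.log(n / 2)
```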
nice trick, I think it works.
ldog
@Keith: Very nice trick. Enjoyed it.
Mehrdad Afshari
@Keith: I don't get it yet. Could you (or someone) expand a few more terms for me in the "..." part of "log(n/2) + ... + log(n)" please? Thanks!
j_random_hacker
@j_random_hacker: `log(n/2) + log(n/2 + 1) + ... + log(n - 1) + log(n)` (larger half of the terms of `log(n!)`). Actually, I just read the question and saw that the clue is stated in the question. Basically, `(n/2)^(n/2) <= n! <= n^n` => `log((n/2)^(n/2))<=log(n!)<=log(n^n)` => `Θ(n/2 * log(n/2))<=log(n!)<=Θ(n*log(n))`
Mehrdad Afshari
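The sandwich (n/2)^(n/2) <= n! <= n^n from the comment above can be verified by comparing logarithms, which avoids huge integers (a sketch over even n up to an arbitrary cutoff):

```python
import math

# Verify (n/2)^(n/2) <= n! <= n^n by comparing logs, for a range of even n.
for n in range(2, 200, 2):
    log_fact = sum(math.log(i) for i in range(1, n + 1))
    assert (n / 2) * math.log(n / 2) <= log_fact <= n * math.log(n)
```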
@Mehrdad Afshari: Thanks, very helpful! The last part that needed to click was that the `/2` inside the `log()` could be extracted out to a `-log(1/2)` term which then disappears when Omega() is applied -- somehow I was forgetting the basic rules of logs... :/
j_random_hacker
A: 

This might help:

e^(ln(x)) = x

and

(l^m)^n = l^(m*n)
aaa
Actually, that's wrong: 1^(m^n) != 1^(m*n); it must be (1^m)^n = 1^(m*n)
Pindatjuh
Errr I mean L instead of 1 in the above comment.
Pindatjuh
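Both identities (with the base written as l, per the correction in the comments) can be spot-checked numerically with arbitrary sample values:

```python
import math

# Spot-check the two identities for sample values (base l > 0):
l, m, n_exp, x = 3.0, 4.0, 2.0, 7.5
assert math.isclose(math.exp(math.log(x)), x)             # e^(ln x) = x
assert math.isclose((l ** m) ** n_exp, l ** (m * n_exp))  # (l^m)^n = l^(m*n)
```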
+2  A: 

See Stirling's Approximation.

dsimcha
+2  A: 

Picking up where Mick Sharpe left off:

Its derivation is quite simple: see http://en.wikipedia.org/wiki/Logarithm -> Group Theory

log(n!) = log(n * (n-1) * (n-2) * ... * 2 * 1) = log(n) + log(n-1) + ... + log(2) + log(1)

Think of n as infinitely big. What is infinity minus one? Or minus two? Etc.

log(inf) + log(inf) + log(inf) + ... = inf * log(inf)

And then think of inf as n.

Pindatjuh
+4  A: 

A lot of people are pointing you to Stirling's formula to prove this. Although that is a perfectly acceptable route, you don't have to use it if you don't know it; in fact, you can show the result another way.

First off, Stack Overflow is horrible for math markup, so what I'm going to write below is going to look ugly. (Nudge, nudge: the devs of Stack Overflow should follow MathOverflow's example and add some nice LaTeX markup.)

That said, let's get on with things. The upper bound log(n!) <= n log n is trivial, since each of the n terms log(1), ..., log(n) is at most log n.

The lower bound is not so trivial. I hope that by now in your career you are familiar with Taylor series; if you aren't, you should quickly familiarize yourself with them, because they are going to be one of your most potent tools for approximating things, or in this case for bounding them.

Note that

log(n!) = sum[i,1,n] ( log i )
        = sum[i,1,n] ( log ( (i/n) * n ) )
        = sum[i,1,n] ( log ( i/n ) + log n )
        = sum[i,1,n] ( log ( i/n ) ) + sum[i,1,n] ( log n )
        = sum[i,0,n-1] ( log ( 1 - i/n ) ) + sum[i,1,n] ( log n )

now from taylor series

log( 1 - x ) = - sum[n,1, infinity] x^n/n  for -1 < x < 1
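The series can be spot-checked against math.log with a truncated sum (x = 0.5 and 1000 terms are arbitrary choices for the sketch):

```python
import math

# Truncated Taylor series for log(1 - x), valid for -1 < x < 1.
x = 0.5
approx = -sum(x ** k / k for k in range(1, 1001))
assert math.isclose(approx, math.log(1 - x))
```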

so

log( 1 - i/n ) = - sum[k,1,infinity] ( (i/n)^k / k )
               = - (i/n) ( 1 + (i/n)/2 + (i/n)^2/3 + ... )
              >= - ( 1 + (i/n)/2 + (i/n)^2/3 + ... )    since 0 <= i/n < 1
              >= - ( 1 + (i/n) + (i/n)^2 + ... )
               = - 1/(1 - i/n) = n/(i-n)
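The net inequality of this chain, log(1 - x) >= -1/(1 - x) for 0 <= x < 1, can be checked on a grid of sample points:

```python
import math

# Check log(1 - x) >= -1/(1 - x) on a grid of x values in [0, 1).
for k in range(100):
    x = k / 100
    assert math.log(1 - x) >= -1 / (1 - x)
```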

so

log(n!) = sum[i,0,n-1] ( log ( 1 - i/n ) ) + sum[i,1,n] ( log n )
        = sum[i,0,n-1] ( log ( 1 - i/n ) ) + n log n
       >= sum[i,0,n-1] ( n/(i-n) ) + n log n
        = n sum[i,0,n-1] ( 1/(i-n) ) + n log n
        = - n sum[i,1,n] ( 1/i ) + n log n
        = n ( log n - sum[i,1,n] ( 1/i ) )
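The bound derived so far, log(n!) >= n(log n - H_n), where H_n = 1 + 1/2 + ... + 1/n is the n-th harmonic number, holds numerically (natural logs; a quick sketch over small n):

```python
import math

# Check log(n!) >= n * (log n - H_n), where H_n = 1 + 1/2 + ... + 1/n.
for n in range(2, 100):
    log_fact = sum(math.log(i) for i in range(1, n + 1))
    h_n = sum(1 / i for i in range(1, n + 1))
    assert log_fact >= n * (math.log(n) - h_n)
```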

Let H_n = sum[i,1,n] ( 1/i ). Then there exists a constant c with 0 < c < 1 such that H_n <= c log n. To see this, note that

log(x) is bounded below by some a > 0 for all x > 2

and so the integral must be strictly greater than the sum (this is a general result for strictly positive functions that are bounded below by some a > 0):

H_n = sum[i,1,n] ( 1/i ) < integral[x,1,n] ( 1/x ) = log n  for n >= 2

thus there must exist a constant c with 0 < c < 1 such that H_n <= c log n.

Then we have that

log(n!) >= n ( log n - H_n )
        >= n ( log n - c log n )
         = ( 1 - c ) n log n

proving the result.

ldog
A: 

Big-O is about asymptotic behavior. For quicksort you can say that it sorts in O(n!) time or less. That is true, but as I said, Big-O is about asymptotics.

In this case you should prove that the function lies between two other functions, establishing a lower and an upper bound, and this is what helps you prove the Θ property.

ralu