f(n) = (log n)^(log n)
g(n) = n / log n

Is f(n) = O(g(n))?

A: 

If Limit[f[x] / g[x], x -> Infinity] = Infinity, then f[x] grows faster than g[x].

Limit[Log[x] ^ Log[x] / (x / Log[x]), x -> Infinity] = + Infinity

So Log[x]^Log[x] grows faster than x/Log[x]; in particular, f is not O(g) (though g = O(f)).
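
A quick numeric sanity check (a sketch, assuming natural logs as in the limit above) shows the ratio f/g blowing up:

    Table[N[Log[x]^Log[x] / (x/Log[x])], {x, {10^2, 10^4, 10^6}}]
    (* roughly {52, 7*10^5, 8*10^10} *)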

Yassin
If you have Mathematica, you can use Plot[Log[x]^(Log[x])/(x/Log[x]), {x, 1, 100}] to see it (starting at x = 1 keeps Log[x] non-negative).
Yassin
http://www.wolframalpha.com/input/?i=plot+%28log%28x%29%29%5elog%28x%29+and+x%2Flog%28x%29+from+1+to+100
KennyTM
+4  A: 

Take the log of both sides:

log(f(n)) = log(log n) * log n

log(g(n)) = log(n) - log(log(n)) = log(n)(1 - log(log(n))/log(n))

Clearly log(log(n)) tends to infinity while (1 - log(log(n))/log(n)) tends to 1, so log(f) outgrows log(g); hence g is O(f), and f is not O(g). Since it's homework, you may need to fill in the details.
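
If you'd rather check the ratio of the logs mechanically, here is a one-line sketch in Mathematica syntax (natural logs assumed); it should report that log(f)/log(g) diverges:

    Limit[(Log[x] Log[Log[x]]) / (Log[x] - Log[Log[x]]), x -> Infinity]
    (* expect Infinity, i.e. log(f) outgrows log(g) *)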

It's also fairly easy to get an idea what the answer should be just by trying it with a large number. 1024 is 2^10, so taking n = 1024 (and logs base 2):

f(n) = 10^10

g(n) = 1024/10.

Obviously that's not a proof, but I think we can see who's winning this race.
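
To extend the race to larger n, here is a sketch using base-2 logs to match the n = 1024 example:

    Table[{n, N[Log[2, n]^Log[2, n]], N[n/Log[2, n]]}, {n, {2^10, 2^20, 2^30}}]
    (* f: 10^10, 20^20 ~ 10^26, 30^30 ~ 2*10^44; g: ~102, ~52429, ~3.6*10^7 *)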

Steve Jessop
A: 

Mathematica gives the limit of f(n) / g(n) as n tends to infinity as infinity, which means that f grows faster. This means that g(n) ∈ O(f(n)).

You can use Wolfram Alpha (see the link in the comments above), for example, if you don't have Mathematica.

IVlad
+3  A: 

f(n) grows faster than g(n) if and only if f(e^n) also grows faster than g(e^n), since exp is strictly increasing to infinity (prove it yourself).

Now f(e^n) = n^n and g(e^n) = e^n / n, and you can quote the known results.
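
If you want Mathematica to confirm the substitution, a minimal sketch (the assumption n > 1 lets Log[E^n] simplify to n):

    FullSimplify[{Log[E^n]^Log[E^n], E^n / Log[E^n]}, n > 1]
    (* {n^n, E^n/n} *)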

KennyTM
This isn't quite right. 1 - 1/x is strictly increasing, but it doesn't follow that if f grows strictly faster than g, then 1 - 1/f grows strictly faster than 1 - 1/g. They might grow at the same big-O rate (unsurprisingly, since 1 - 1/x converges to a constant). So your result is true, but not for the reasons stated.
Steve Jessop
@Steve: (1) "If and only if" (2) You should compare f(1 - 1/x), not 1 - 1/f(x).
KennyTM
OK, so with f(n) = n^2, g(n) = n. 1 - 1/n^2, (1 - 1/n)^2, and 1 - 1/n are all O(1), so they don't grow (strictly) faster even though some of them are strictly greater than others (for n > 1). exp is "better" for our purposes than just strictly increasing, and proves a stronger result.
Steve Jessop
+1, then. You don't even need that exp is strictly increasing, just that it and its inverse both tend to infinity. I think.
Steve Jessop
`exp` is *convex*
Ben Voigt
A: 

f is vastly bigger: since (log n)^(log n) = n^(log log n), the ratio f(n)/g(n) works out to n^(log log n - 1) · log n.
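
A numeric spot-check of that factor (a sketch; n = 10^6 is an arbitrary test value, natural logs):

    With[{n = 10.^6},
     {Log[n]^Log[n] / (n/Log[n]), n^(Log[Log[n]] - 1) Log[n]}]
    (* both come out around 7.9*10^10 *)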

Fakrudeen