f(n)=(log(n))^log(n)
g(n)= n/log(n)
f=O(g(n))?
If Limit[f[x] / g[x], x -> Infinity] = Infinity, then f[x] grows faster than g[x].
Limit[Log[x] ^ Log[x] / (x / Log[x]), x -> Infinity] = + Infinity
So, Log[x] ^ Log[x] grows faster than x / Log[x], i.e. f is not O(g) (in fact g = O(f)).
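The limit claim can be sanity-checked numerically. A sketch in Python (names are mine), working in log-space so nothing overflows:

```python
import math

def log_ratio(x):
    """log of f(x)/g(x), where f(x) = (log x)^(log x) and g(x) = x / log x.

    log(f/g) = log(x)*log(log(x)) - log(x) + log(log(x))
    """
    lx = math.log(x)
    return lx * math.log(lx) - lx + math.log(lx)

# The log of the ratio keeps growing, so the ratio itself diverges.
for x in [10.0 ** k for k in (2, 4, 8, 16, 32)]:
    print(x, log_ratio(x))
```

The printed values increase without bound, consistent with the limit being +Infinity.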
Take the log of both sides:
log(f(n)) = log(log n) * log n
log(g(n)) = log(n) - log(log(n)) = log(n)(1 - log(log(n))/log(n))
Dividing, log(f(n)) / log(g(n)) = log(log(n)) / (1 - log(log(n))/log(n)), which tends to infinity: the numerator is unbounded while the denominator tends to 1. So log(f) eventually exceeds log(g) by any margin, hence g is O(f) and f is not O(g). Since it's homework, you may need to fill in the details.
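To see the dominance concretely, here is a sketch in Python (helper names are mine) tabulating both logarithms and their ratio:

```python
import math

def log_f(n):
    # log f(n) = log(log n) * log n
    return math.log(math.log(n)) * math.log(n)

def log_g(n):
    # log g(n) = log n - log(log n)
    return math.log(n) - math.log(math.log(n))

# The ratio log f / log g equals log(log n) / (1 - log(log n)/log n)
# and grows without bound, so f eventually dwarfs g.
for n in (10**3, 10**6, 10**12, 10**24):
    print(n, log_f(n) / log_g(n))
```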
It's also fairly easy to get an idea what the answer should be just by trying it with a large number. 1024 is 2^10, so taking n = 1024 (with base-2 logarithms):
f(n) = 10^10 = 10,000,000,000
g(n) = 1024/10 = 102.4
Obviously that's not a proof, but I think we can see who's winning this race.
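Those back-of-the-envelope numbers are easy to reproduce. A quick check in Python, using base-2 logarithms as in the example:

```python
import math

n = 1024  # 2**10, so log2(n) is exactly 10
f = math.log2(n) ** math.log2(n)   # (log2 n)^(log2 n) = 10**10
g = n / math.log2(n)               # n / log2 n = 1024/10

print(f)  # 10000000000.0
print(g)  # 102.4
```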
f(n) grows faster than g(n) if and only if f(e^n) grows faster than g(e^n), since exp is strictly increasing to infinity (prove it yourself).
Now f(e^n) = n^n and g(e^n) = e^n / n, and you can quote the known results (n^n grows faster than any exponential, in particular e^n).
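The substitution identities can be confirmed numerically. A sketch assuming natural logarithms in f and g (function names are mine):

```python
import math

def f(x):
    return math.log(x) ** math.log(x)

def g(x):
    return x / math.log(x)

n = 5.0
x = math.e ** n  # substitute x = e^n
# f(e^n) = (log e^n)^(log e^n) = n^n, and g(e^n) = e^n / n
assert math.isclose(f(x), n ** n)
assert math.isclose(g(x), x / n)
```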