I would break this up into several easy, reusable lemmas:
Lemma 1: For a positive constant k, f = O(g) if and only if f = O(k g).
Proof: Suppose f = O(g). Then there exist constants c and N such that |f(n)| < c |g(n)| for n > N.
Thus |f(n)| < (c/k)(k |g(n)|) for n > N with constant c/k, so f = O(k g). The converse follows the same way, absorbing the factor k into the constant instead.
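In case the bookkeeping is easier to follow displayed, here is the same constant-shuffling in both directions (everything for n > N; the display assumes amsmath):

```latex
\begin{align*}
|f(n)| < c\,|g(n)| \;&\Longrightarrow\; |f(n)| < \tfrac{c}{k}\bigl(k\,|g(n)|\bigr), && \text{so } f = O(kg) \text{ with constant } \tfrac{c}{k},\\
|f(n)| < c\bigl(k\,|g(n)|\bigr) \;&\Longrightarrow\; |f(n)| < (ck)\,|g(n)|, && \text{so } f = O(g) \text{ with constant } ck.
\end{align*}
```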
Lemma 2: If h(x) = x^k for some positive constant k (a positive, monotonically increasing function, and the only case we need below) and f and g are positive for sufficiently large n, then f = O(g) if and only if h(f) = O( h(g) ).
Proof: Suppose f = O(g). Then there exist constants c and N such that |f(n)| < c |g(n)| for n > N. Since f and g are positive for n > M, f(n) < c g(n) for n > max(N, M). Raising an inequality between positive numbers to the k-th power preserves it, so h(f(n)) = f(n)^k < c^k g(n)^k = c^k h(g(n)) for n > max(N, M), and since everything here is positive, |h(f(n))| < c^k |h(g(n))| for n > max(N, M). Thus h(f) = O(h(g)).
The converse follows similarly; the key fact being that taking k-th roots of positive quantities also preserves inequalities, so h(f(n)) < c h(g(n)) gives f(n) < c^(1/k) g(n).
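For the record, the instance used later is k = 5; with the constants made explicit:

```latex
\[
f = O(g) \;\Longleftrightarrow\; f^5 = O\!\bigl(g^5\bigr),
\qquad \text{constants } c \mapsto c^5 \text{ (forward)}, \quad c \mapsto c^{1/5} \text{ (converse)}.
\]
```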
Lemma 3: If h is an invertible, monotonically increasing function that grows without bound, then f = O(g) if and only if f(h) = O(g(h)).
Proof: Suppose f = O(g). Then there exist constants c, N such that |f(n)| < c |g(n)| for n > N. Thus |f(h(n))| < c |g(h(n))| whenever h(n) > N. Since h is invertible, monotonically increasing, and unbounded, h(n) > N whenever n > h^-1(N). Thus h^-1(N) is the new constant we need, and f(h(n)) = O(g(h(n))).
The converse follows similarly, applying the same argument with h's inverse in place of h (composing f(h) and g(h) with h^-1 recovers f and g).
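Spelled out, the converse is just the forward direction applied with h^-1, which is also invertible, monotonically increasing, and unbounded:

```latex
\[
f(h) = O\bigl(g(h)\bigr)
\;\Longrightarrow\;
f\bigl(h(h^{-1}(n))\bigr) = O\Bigl(g\bigl(h(h^{-1}(n))\bigr)\Bigr)
\;\Longrightarrow\;
f(n) = O\bigl(g(n)\bigr).
\]
```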
Lemma 4: If h(n) is nonzero for n > M, f = O(g) if and only if f(n)h(n) = O(g(n)h(n)).
Proof: If f = O(g), then for constants c, N, |f(n)| < c |g(n)| for n > N. Since |h(n)| is positive for n > M, |f(n)h(n)| < c |g(n)h(n)| for n > max(N, M) and so f(n)h(n) = O(g(n)h(n)).
The converse follows similarly by using 1/h(n).
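Here is the converse's 1/h(n) trick written out; everything below holds for n > max(N, M):

```latex
\[
|f(n)h(n)| < c\,|g(n)h(n)|
\;\Longrightarrow\;
|f(n)| = \frac{|f(n)h(n)|}{|h(n)|} < c\,\frac{|g(n)h(n)|}{|h(n)|} = c\,|g(n)|.
\]
```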
Lemma 5a: log n = O(n).
Proof: Let f = log n, g = n (take log to be the natural logarithm; any other base only changes things by a constant factor, which Lemma 1 absorbs). Then f' = 1/n and g' = 1, so for n > 1, g increases more quickly than f. Moreover g(1) = 1 > 0 = f(1), so |f(n)| < |g(n)| for n > 1 and f = O(g).
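Integrating that derivative comparison gives the same bound in one line (natural log):

```latex
\[
\log n = \int_1^n \frac{dt}{t} \;\le\; \int_1^n 1\,dt \;=\; n - 1 \;<\; n \qquad (n \ge 1).
\]
```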
Lemma 5b: n != O(log n).
Proof: Suppose otherwise for contradiction, and let f = n and g = log n. Then for some constants c, N, |n| < c |log n| for n > N.
Let d = max(2, 2c, sqrt(N+1)). By the calculation in Lemma 5a, since d >= 2 > 1, log d < d. Thus
|f(2d^2)| = 2d^2 > 2d(log d) >= d log d + d log 2 = d log(2d) >= 2c log(2d) = c log(4d^2) > c log(2d^2) = c g(2d^2) = c |g(2d^2)|, and 2d^2 > N since d >= sqrt(N+1), giving a contradiction. Thus f != O(g).
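That chain of inequalities is easier to check displayed, with each step's justification alongside:

```latex
\begin{align*}
|f(2d^2)| = 2d^2 &> 2d\log d              && (\log d < d \text{ by Lemma 5a, since } d \ge 2 > 1)\\
                 &\ge d\log d + d\log 2   && (\log d \ge \log 2 \text{ since } d \ge 2)\\
                 &= d\log(2d)\\
                 &\ge 2c\log(2d)          && (d \ge 2c)\\
                 &= c\log(4d^2) \;>\; c\log(2d^2) = c\,|g(2d^2)|.
\end{align*}
```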
So now we can assemble the answer to the question you originally asked.
Step 1:
log n = O(n^a)
n^a != O(log n)
For any positive constant a.
Proof: log n = O(n) by Lemma 5a. Thus log n = 1/a log n^a = O(1/a n^a) = O(n^a) by Lemmas 3 (for h(n) = n^a), 4, and 1. The second fact follows similarly by using Lemma 5b.
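Displayed, the chain for the first fact is:

```latex
\begin{align*}
\log n = \tfrac{1}{a}\log\bigl(n^a\bigr)
  &= O\!\bigl(\tfrac{1}{a}\,n^a\bigr) && \text{Lemma 3 with } h(n) = n^a\text{, then Lemma 4 with the constant } \tfrac{1}{a},\\
  &= O\bigl(n^a\bigr)                 && \text{Lemma 1 with } k = \tfrac{1}{a}.
\end{align*}
```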
Step 2:
log^5 n = O(n^0.01)
n^0.01 != O(log^5 n)
Proof: log n = O(n^0.002) by step 1. Then by Lemma 2 (with h(n) = n^5), log^5 n = O( (n^0.002)^5 ) = O(n^0.01). The second fact follows similarly.
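In display form:

```latex
\[
\log^5 n \;=\; \bigl(\log n\bigr)^5 \;=\; O\!\Bigl(\bigl(n^{0.002}\bigr)^5\Bigr) \;=\; O\!\bigl(n^{0.01}\bigr)
\qquad \text{(Step 1 with } a = 0.002\text{, then Lemma 2 with } k = 5\text{)}.
\]
```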
Final answer:
n log^5 n = O(n^1.01)
n^1.01 != O(n log^5 n)
In other words, with f = n log^5 n and g = n^1.01:
f = O(g)
f != Theta(g)
f != Omega(g)
Proof: Apply Lemma 4 (using h(n) = n) to step 2.
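The last multiplication, displayed:

```latex
\[
n\,\log^5 n \;=\; O\!\bigl(n \cdot n^{0.01}\bigr) \;=\; O\!\bigl(n^{1.01}\bigr)
\qquad \text{(Lemma 4 with } h(n) = n\text{, applied to Step 2)}.
\]
```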
With practice these rules become "obvious" and second nature, and unless your test requires that you prove your answer, you'll find yourself whipping through these kinds of big-O problems.