Hi all,
I'm just trying to understand why, in little-o notation, the following is true:
lim (n → ∞) f(n)/g(n) = 0
Can someone explain that to me?
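In case it's useful, here is how I currently read that limit statement, written out in LaTeX (the epsilon and N names are my own, just spelling out the usual limit definition):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% The limit statement unpacked: for every epsilon > 0 there is a
% threshold N beyond which the ratio stays below epsilon.
\[
  \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
  \quad\iff\quad
  \forall \varepsilon > 0 \;\exists N \;\forall n \ge N:\;
  \frac{f(n)}{g(n)} < \varepsilon .
\]
\end{document}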
I do get the idea that f(n) = o(g(n)) means that f(n) eventually grows no faster than c·g(n) for every constant c > 0, however small c is.
I just don't get how that definition gives the limit statement above.
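To make my question concrete, here is a small case I tried, where both statements visibly hold (f(n) = n and g(n) = n^2 are just my own choice of functions):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% My own test case: f(n) = n, g(n) = n^2.
% The ratio clearly goes to 0:
\[
  \frac{f(n)}{g(n)} = \frac{n}{n^2} = \frac{1}{n} \to 0
  \quad \text{as } n \to \infty,
\]
% and the definition I quoted also holds: for any fixed c > 0,
% we have n <= c * n^2 as soon as n >= 1/c.
\[
  \forall c > 0:\quad n \le c\,n^2 \quad \text{for all } n \ge \tfrac{1}{c}.
\]
% So both statements are true here, but I don't see the general
% argument connecting them.
\end{document}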