tags:

views: 170

answers: 1
If algorithm A has complexity O(n) and algorithm B has complexity o(n^2), what, if anything, can we say about the relationship between A and B? Note: the complexity of A is expressed using big-Oh, and the complexity of B is expressed using little-Oh.

+2  A: 

As Wikipedia expressively puts it,

"f(x) is little-o of g(x)" ... means that g(x) grows much faster than f(x)

(my ellipsis and emphasis) -- putting it in the somber way that "mathematician wannabes" seem to require: the limit of f(x)/g(x), as x tends to infinity, is 0.
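Written out formally (a quick LaTeX sketch of the standard textbook definitions, nothing beyond them):

```latex
% Big-O: f grows no faster than g, up to some constant factor.
f(x) = O(g(x)) \iff \exists\, C > 0,\ x_0 : |f(x)| \le C\,|g(x)| \ \text{for all } x \ge x_0

% Little-o: f grows strictly slower than g.
f(x) = o(g(x)) \iff \lim_{x \to \infty} \frac{f(x)}{g(x)} = 0
```

The big-O bound only has to hold for *some* constant factor C, while the little-o condition forces f below *every* constant multiple of g eventually -- that's what makes it the strictly stronger statement.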

So, there's not all that much we can say: A grows no faster than n, B grows much slower than n squared; it might even be the same algorithm you're talking about ;-) -- just as if you were expressing the same constraints verbally ("A is no worse than linear, B is better than quadratic").

Indeed, anything that is O(n) will also be o(n squared), though not vice versa; for example, n to the power of K for some constant K is O(n) for any K <= 1, but it is o(n squared) for any K < 2. Other functions that are o(n squared) but not O(n) include the commonly encountered n log n.
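Why the containment holds follows straight from the definitions above: if f(n) <= C*n for all large n, then f(n)/n^2 <= C/n, which tends to 0. A minimal Python sketch (the sample growth functions below are just illustrative picks, not anything specific to the question) that tabulates the relevant ratios:

```python
import math

# Illustrative growth rates (hypothetical running-time functions).
candidates = {
    "n":       lambda n: n,                # O(n), hence also o(n^2)
    "n^1.5":   lambda n: n ** 1.5,         # o(n^2) but NOT O(n)
    "n log n": lambda n: n * math.log(n),  # o(n^2) but NOT O(n)
    "n^2":     lambda n: n ** 2,           # O(n^2) but NOT o(n^2)
}

# f is o(n^2) exactly when f(n)/n^2 tends to 0 as n grows;
# f is O(n) roughly when f(n)/n stays bounded.
for name, f in candidates.items():
    for n in (10**3, 10**6, 10**9):
        print(f"{name:8s} n={n:>10d}  f(n)/n^2 = {f(n) / n**2:.3e}  f(n)/n = {f(n) / n:.3e}")
```

In the output, the f(n)/n^2 column shrinks toward 0 for the first three rows (they're all o(n squared)) but stays at 1 for n squared itself, while the f(n)/n column stays bounded only for the first row -- the only one that is also O(n).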

Alex Martelli
if "g(x) grows much faster than f(x)" is to be taken literally, then it's logically impossible that f and g are the same... ;)
Tor Valamo
That is a sloppy summary from Wikipedia of mathematically precise terms.
FranticPedantic
@Frantic, anybody's of course welcome to click through and read the whole Wikipedia entry -- I chose to quote the short, expressive summary because I find it intuitively useful. I don't find your short, pointers-free answer to be **any** use at all, and obviously, given your downvote, you must think similarly of mine, but surely the fact that at least I *do* point to something objectively useful must give mine the objective edge ;-).
Alex Martelli
@FranticPedantic - Feel free to correct the Wikipedia article if you know it's wrong. It's helpful when someone who actually knows it writes about it. :)
Tor Valamo