If algorithm A has complexity O(n) and algorithm B has complexity o(n^2), what, if anything, can we say about the relationship between A and B? Note: the complexity of A is expressed using big-Oh, and the complexity of B is expressed using little-Oh.
As Wikipedia expressively puts it,
"f(x) is little-o of g(x)" ... means that g(x) grows much faster than f(x)
(my ellipsis and emphasis) -- or, putting it in the somber way that "mathematician wannabes" seem to require, f(x)/g(x) tends to 0 as x tends to infinity.
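To get a concrete feel for that definition, here's a minimal Python sketch (the function names `f` and `g` are my own, and the sample values of `x` are arbitrary) that watches the ratio shrink:

```python
# Numerically illustrate the little-o definition: f(x)/g(x) -> 0 as x -> infinity.
# Here f(x) = x (linear) and g(x) = x**2 (quadratic), so f is o(g).

def f(x):
    return x          # linear

def g(x):
    return x ** 2     # quadratic

for x in (10, 1_000, 100_000, 10_000_000):
    print(f"x = {x:>10}: f(x)/g(x) = {f(x) / g(x):.8f}")

# The ratio keeps shrinking toward 0, which is exactly what "f is o(g)" requires.
```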
So, there's not all that much we can say: A grows no faster than n, B grows strictly slower than n squared, and it might even be the same algorithm you're talking about ;-) -- just as if you had expressed the same constraints verbally ("A is no worse than linear, B is better than quadratic").
Indeed, anything that is O(n) is also o(n squared), though not vice versa; for example, n to the power of K for some constant K is O(n) for any K <= 1, but it is o(n squared) for any K < 2. Other functions that are o(n squared) but not O(n) include the commonly encountered n log n.
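A quick numeric check of those claims (again a minimal sketch with arbitrary sample values of `n`): both n**1.5 and n*log(n) shrink relative to n**2, so both are o(n squared), while n*log(n)/n = log(n) grows without bound, so n log n is not O(n):

```python
import math

# n**1.5 / n**2 and n*log(n) / n**2 both tend to 0 (so both are o(n**2)),
# while n*log(n) / n = log(n) grows without bound (so n*log(n) is NOT O(n)).

for n in (10, 10_000, 10_000_000):
    print(f"n = {n:>10}: "
          f"n**1.5 / n**2 = {n**1.5 / n**2:.6f}, "
          f"n*log(n) / n**2 = {n * math.log(n) / n**2:.6f}, "
          f"n*log(n) / n = {math.log(n):.2f}")
```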