A: 

If your constant C is greater than your value of n, then C*n > n*n and the O(n²) algorithm would be better.
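
A quick numeric check of that claim; the value of C here is a made-up constant factor, not taken from any real algorithm:

```python
# When C > n, the modelled linear cost C*n exceeds the quadratic
# cost n*n, so the O(n^2) algorithm is cheaper at this input size.
C, n = 1000, 50
assert C > n
assert C * n > n * n   # 50000 > 2500: the quadratic algorithm wins here
```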

Chris J
+1  A: 

There is always an implied constant in O notation, so yes, it's possible that for sufficiently small n, an O(n^2) algorithm may be faster than an O(n) one. This would happen if the constant for the O(n) algorithm was much larger than the constant for the O(n^2) one.
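
A minimal cost-model sketch of this point; the constants `C_LINEAR` and `C_QUAD` are hypothetical values chosen for illustration:

```python
# Hypothetical per-operation constants: the linear algorithm carries
# a large constant, the quadratic one a small constant.
C_LINEAR = 500
C_QUAD = 1

def cost_linear(n):
    return C_LINEAR * n

def cost_quad(n):
    return C_QUAD * n * n

# Small n: the O(n^2) algorithm is cheaper despite its worse growth rate.
assert cost_quad(100) < cost_linear(100)          # 10000 < 50000
# Large n: the asymptotic behaviour takes over.
assert cost_quad(100_000) > cost_linear(100_000)
```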

Paul R
You mean the other way around: O(n^2) faster than O(n)
Andreas Brinck
@Andreas: thanks - good catch - now fixed
Paul R
A: 

C*n < n*n is NOT always true; there is a crossover point in n where the inequality flips.

When C is large and n is small, C*n > n*n. However, C is a constant, so once n grows past C, C*n < n*n.

Therefore, for sufficiently large n, the O(n) algorithm is always better than the O(n²) one.
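
With these cost models the crossover point can be computed directly: C*n < n*n exactly when n > C, so the flip happens just past n = C. A small sketch (C is arbitrary):

```python
def crossover(C):
    """Smallest n at which C*n < n*n, i.e. where the O(n) model starts to win."""
    n = 1
    while C * n >= n * n:
        n += 1
    return n

# Under these cost models the linear algorithm first wins at n = C + 1.
assert crossover(100) == 101
assert crossover(7) == 8
```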

StartClass0830
+2  A: 

I think there are two issues here: first, what the notation says; second, what you would actually measure on real programs.

  1. Big O is defined as a limit as n -> infinity, so in terms of big O, O(n) < O(n^2) is always true regardless of any finite constants.

  2. As others have pointed out, real programs only ever deal with some finite input, so it is quite possible to pick a small enough value of n such that c*n > n^2 (i.e. c > n); however, strictly speaking you are then no longer dealing with big O.
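
Point 2 can be checked numerically; `c` below is an arbitrary constant chosen for illustration:

```python
c = 1000  # hypothetical constant factor on the linear algorithm

# For any n < c, the "linear" cost model is actually the larger one...
for n in (10, 100, 999):
    assert c * n > n * n
# ...but that advantage disappears as soon as n exceeds c.
for n in (1001, 10_000):
    assert c * n < n * n
```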

jk
hmm there seems to be a rendering error that is repeating words in my post?
jk
@jk It's rendering fine for me. Also, +1 for the clear answer. :)
Daniel Stutzbach
It's just the word "possible" in point 2 now
jk
@jk: What do you mean when you say you are no longer dealing with big O?
jasonline