If your constant C is greater than your value of n, then C * n > n², so the O(n²) algorithm would actually be faster.
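A rough sketch of that comparison, assuming (purely for illustration) that the linear algorithm costs exactly C * n operations and the quadratic one exactly n * n:

```python
# Toy cost model: assumed costs of C * n for the linear algorithm
# and n * n for the quadratic one. C = 100 is an arbitrary choice.
C = 100

for n in (10, 100, 1000):
    linear_cost = C * n
    quadratic_cost = n * n
    winner = "O(n^2)" if quadratic_cost < linear_cost else "O(n)"
    print(f"n={n}: C*n={linear_cost}, n^2={quadratic_cost} -> {winner} is cheaper")
```

With C = 100, the quadratic algorithm wins at n = 10 and the linear one wins once n passes 100.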
There is always an implied constant in O notation, so yes, it's possible that for sufficiently small n an O(n²) algorithm may be faster than an O(n) one. This would happen if the constant for the O(n) algorithm was much larger than the constant for the O(n²) algorithm.
C * n < n² is NOT always true; there is a point where the inequality reverses.
When C is large and n is small (specifically n < C), C * n > n². However, C is a constant, so once n grows large enough, C * n < n².
Therefore, when n is large enough, the O(n) algorithm is always faster than the O(n²) one.
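A minimal sketch of that crossover, again assuming idealised costs of exactly C * n and n * n rather than real running times:

```python
def crossover(C):
    """Return the smallest n at which the quadratic cost n * n
    exceeds the linear cost C * n, i.e. where the O(n) algorithm
    takes over. Uses idealised operation counts only."""
    n = 1
    while n * n <= C * n:
        n += 1
    return n

for C in (5, 50, 500):
    print(f"C={C}: O(n) becomes cheaper once n >= {crossover(C)}")
```

As expected, the crossover lands right after n = C: the bigger the hidden constant, the longer the O(n²) algorithm stays competitive.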
I think there are two issues here: first, what the notation says; second, what you would actually measure on real programs.
Big O is defined as a limit as n -> infinity, so in terms of big O, O(n) is always better than O(n²), regardless of any finite constants.
As others have pointed out, real programs only ever deal with some finite input, so it is quite possible to pick a small enough value of n such that c*n > n^2, i.e. c > n; however, strictly speaking you are then no longer dealing with big O.
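To make the measurement side concrete, here is a sketch that times a linear function with a deliberately heavy per-element cost against a simple quadratic one; both functions and the constant are invented purely for illustration:

```python
import timeit

C = 100  # artificial "large constant" baked into the linear algorithm

def linear_big_constant(xs):
    # O(n), but each element incurs C units of extra work
    total = 0
    for x in xs:
        for _ in range(C):
            total += x
    return total

def quadratic(xs):
    # O(n^2), but each step is cheap
    total = 0
    for x in xs:
        for y in xs:
            total += x * y
    return total

for n in (10, 100, 1000):
    xs = list(range(n))
    t_lin = timeit.timeit(lambda: linear_big_constant(xs), number=5)
    t_quad = timeit.timeit(lambda: quadratic(xs), number=5)
    print(f"n={n:5d}: linear-with-big-C {t_lin:.4f}s  quadratic {t_quad:.4f}s")
```

On a typical run the quadratic version wins for the small inputs and loses badly once n is well past C, which is exactly the finite-input effect described above.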