Asymptotically, O(n) is better (cheaper) than O(n^2).
For small values of n, this is simple algebra: find the n for which 9n+6 = 2(n^2)+1.
Rearranging gives the quadratic equation 2(n^2)-9n-5 = 0, whose positive root is n=5.
So for n=5, both processes have the same cost:
- 9n+6 => n:5 => 9*5+6 = 45+6 = 51
- 2(n^2)+1 => n:5 => 2(5*5)+1 = 2*25+1 = 50+1 = 51
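The arithmetic above can be double-checked with a short script. This is just a sketch: `cost_a` and `cost_b` are hypothetical names I'm using for the two cost formulas from the question.

```python
# Cost models for the two algorithms from the question.
def cost_a(n):
    return 9 * n + 6       # the O(n) process

def cost_b(n):
    return 2 * n ** 2 + 1  # the O(n^2) process

# Scan small n to locate the crossover point.
for n in range(1, 8):
    a, b = cost_a(n), cost_b(n)
    winner = "A" if a < b else ("B" if b < a else "tie")
    print(f"n={n}: A={a}, B={b} -> {winner}")
```

Running it shows B winning below n=5, a tie at n=5 (both cost 51), and A winning above.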
This means that B is better for n<5, they are equal at n=5, and A is better for n>5. If you expect n to be smaller than 5 in the vast majority of cases, then B may be a better choice, but the difference will only matter if the algorithm is used a lot. If you implement it as a function, the minor benefits of B pale against the call overhead, so they will likely go unnoticed.
In summary, unless you are very sure of what you're doing, go with A. In general, you want the algorithm with the better (cheaper) asymptotic cost. Only when both candidates have the same asymptotic order, or when you have reliable knowledge about the input data you will be getting, is a deeper analysis worth the effort; and even then, the best approach is to benchmark both versions with realistic data rather than to rely on theoretical analysis.
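A minimal benchmarking sketch along those lines, using the standard library's `timeit`. Note that `algo_a` and `algo_b` here are placeholder stand-ins (a linear pass and a quadratic double loop), not the actual algorithms from the question; you would substitute your real implementations and realistic input sizes.

```python
import timeit

# Placeholder for the O(n) algorithm -- replace with the real A.
def algo_a(data):
    return sum(data)

# Placeholder for the O(n^2) algorithm -- replace with the real B.
def algo_b(data):
    return sum(x * y for x in data for y in data)

# Time both on the input sizes you actually expect in production.
for size in (3, 5, 100):
    data = list(range(size))
    t_a = timeit.timeit(lambda: algo_a(data), number=10_000)
    t_b = timeit.timeit(lambda: algo_b(data), number=10_000)
    print(f"n={size}: A={t_a:.4f}s  B={t_b:.4f}s")
```

With placeholders like these, the quadratic version typically wins or ties at tiny sizes and loses badly at larger ones, which is exactly the behavior the cost formulas predict.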