According to the definition of big O, f(n) = O(g(n)) means f(n) <= C*g(n) for some constant C (and large enough n). From this it could be deduced that:
f(n) <= C*1 = C    (for O(1))
f(n) <= C*2 = 2C   (for O(2))
I think there is no big difference between these two. The example I could come up with is:
f(n) = 1 - 1/n    (satisfies f(n) <= C*1, i.e. O(1))
f(n) = 2 - 1/n    (satisfies f(n) <= C*2, i.e. O(2))
C = 1
But what distinguishes these two complexities, since both are constant?
Could you show some real-world code to demonstrate the difference between O(1) and O(2)?
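To make the question concrete, here is a rough sketch of what I mean (the function names are made up purely for illustration): both functions touch a fixed number of elements regardless of the input size, yet one does roughly one step and the other roughly two, so intuitively one looks like O(1) and the other like O(2).

```python
# Minimal sketch, assuming list indexing is a constant-time step.
# The function names are hypothetical, chosen only for this example.

def first_element(items):
    # one constant-time step: roughly f(n) = 1
    return items[0]

def first_plus_last(items):
    # two constant-time steps: roughly f(n) = 2
    return items[0] + items[-1]

print(first_element([10, 20, 30]))    # 10
print(first_plus_last([10, 20, 30]))  # 40
```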