According to the definition of big O, f(n) <= C*g(n) (which means f(n) = O(g(n))), it can be deduced that:

f(n) <= C * 1 = C (for g(n) = 1, i.e. O(1))
f(n) <= C * 2 = 2C (for g(n) = 2, i.e. O(2))

I think there is no big difference between these two. What I could come up with is:

f(n) = 1 - 1 / n
f(n) = 2 - 1 / n
C = 1

But what distinguishes these two complexities, since both are constant?

Could you show some real-world code to demonstrate the difference between O(1) and O(2)?

+8  A: 

O(1) and O(2) are the same, as is any O(constant value).

The point is that neither relies on some function of N.
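For example, here's a minimal Python sketch (the function names are just illustrative): both functions below run in constant time no matter how large the input is, so both are O(1), and equally O(2):

```python
def first_element(items):
    # A single constant-time operation: one index access.
    # The runtime does not depend on len(items).
    return items[0]

def first_and_last(items):
    # Two constant-time operations instead of one. You might be
    # tempted to call this "O(2)", but the bound is still a constant
    # independent of len(items), so it is O(1) all the same.
    return items[0], items[-1]
```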

Mitch Wheat
+13  A: 

There is no difference between O(1) and O(2). Algorithms classified as O(1) are O(2) and vice versa. In fact, O(c1) is O(c2) for any positive constants c1 and c2.

O(c), where c is a positive constant, simply means that the runtime is bounded independently of the input or problem size. From this it is clear (informally) that O(1) and O(2) are equal.

Formally, consider a function f in O(1). Then there is a constant c such that f(n) <= c * 1 for all n. Let d = c / 2. Then f(n) <= c = (c / 2) * 2 = d * 2, which shows that f is O(2). Similarly, if g is O(2), there is a constant c such that g(n) <= c * 2 for all n. Let d = 2 * c. Then g(n) <= c * 2 = d = d * 1, which shows that g is O(1). Therefore O(1) = O(2).
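As a quick numeric sketch of those bounds (reusing the f(n) = 2 - 1/n example from the question; the witness constants here are just one valid choice):

```python
# f(n) = 2 - 1/n is bounded above by 2 for all n >= 1.
f = lambda n: 2 - 1 / n

# f(n) <= 2 * 1 for all n, so f is O(1) with c = 2.
assert all(f(n) <= 2 * 1 for n in range(1, 10**5))

# The same f satisfies f(n) <= 1 * 2, so f is also O(2) with c = 1.
assert all(f(n) <= 1 * 2 for n in range(1, 10**5))
```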

Jason
A: 

You typically do not write O(2) or O(2n), but O(1) and O(n) instead. The difference is in actual speed, of course: 5s vs. 10s, for instance. I found your question a bit confusing.

Hamish Grubijan
No, big-Oh does not tell you runtime and `O(2)` most certainly does not mean twice as slow as `O(1)`.
Jason
Well, he's right that an algorithm that performs `2n` operations will take twice as long as an algorithm that performs `n` operations (assuming all operations take the same amount of time). Of course, "twice as long" when we're talking about milliseconds isn't such a big deal, but when you *square* that number, it can get pretty big, pretty fast.
Mark
Fine, I know that and did not explain it well. However, I was trying to guess what the asker wanted. A pedantic answer is not always the best answer.
Hamish Grubijan
@Mark: First, you're assuming that all operations execute in the same amount of time. More importantly, he did not say an algorithm that performs `2n` operations. He compared `O(n)` to `O(2n)`. There are hidden constants buried in these definitions, which means that they do not directly translate to "number of operations." Further, when we talk about big-Oh we are often talking about average case or worst case, for example, so again you cannot directly translate `O(n)` into "performs `n` operations."
Jason
A: 

There is no difference between O(1) and O(2).

The order-of notation is unique up to a constant. O(f(x)) means that there is some constant k such that the running time is less than k*f(x).

If something is O(2), then there is some constant k such that the running time is less than 2k. Therefore, there's another constant, k' = 2k, that works for O(1).

Chip Uni
A: 

There is no difference between O(1) and O(2). In fact, you would not use this notation. It is more like O(N), O(n^2), O(log(N)), etc. This is just an indication of the order of magnitude of the algorithm. Put in other words: O(1) would be constant in time, O(N) would be linear in the number of items (N), O(n^2) would be exponential in time, etc.
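To illustrate those classes, here's a rough Python sketch (the functions are made-up examples, not canonical definitions):

```python
def constant(items):
    # O(1): the work does not depend on len(items).
    return items[0]

def linear(items):
    # O(N): touches each of the N items once.
    return sum(items)

def pairwise(items):
    # O(n^2): examines every pair of items.
    return sum(a * b for a in items for b in items)

def halving(n):
    # O(log(N)): the problem size is halved on each step.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps
```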

Jim Kramer
O(n^2) is not exponential; it is quadratic in time. O(2^n) is an example of exponential time.
Chip Uni
+2  A: 

Maybe they meant that both algorithms execute in constant time regardless of input size (usually denoted as N), but one of them is twice as fast. That, however, is an abuse of big-O notation.
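A hypothetical sketch of what that would look like in Python (the dictionary-lookup example is mine): both functions take constant time for any table size, but the second does roughly twice the work:

```python
def lookup_once(table, key):
    # One hash lookup: constant time regardless of len(table).
    return table[key]

def lookup_twice(table, key):
    # Two hash lookups: roughly twice the wall-clock time, but still
    # constant regardless of len(table). Writing "O(2)" for this is
    # the abuse of notation described above; it is simply O(1).
    _ = table[key]
    return table[key]
```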

Seva Alekseyev
A: 

Big-O notation is generally used for asymptotic analysis of algorithm complexity, i.e., analysis of how the algorithm performs as n increases toward infinity. The highest-order term in n of the function f(n) will be the dominant part of the function in this case.

As such, lower-order terms in n are usually dropped from the function when it is expressed in big-O (for example, f(n) = 2n^2 + 4 results in O(n^2) asymptotic complexity).

When the highest-order term is a constant, not dependent on n, all such constants are asymptotically equivalent and are usually simply reduced to O(1).

Hence, O(2) would be considered equivalent to O(1).
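A small numeric sketch of why those terms get dropped, using the f(n) = 2n^2 + 4 example above:

```python
# As n grows, the +4 (and even the factor 2) contribute less and less
# relative to n^2, so f(n) = 2n^2 + 4 is classified as O(n^2).
for n in (10, 1_000, 100_000):
    f = 2 * n**2 + 4
    print(n, f, f / n**2)  # the ratio approaches the constant 2
```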

fd
+3  A: 

There is no difference.

In the graph below, the red line represents O(n) and the green curve represents O(n^2).

As you can see by the red line, the 2 and the 1 become insignificant as x increases (the green curve grows at a much faster rate). This is what Big-O notation is trying to capture; constants are relatively meaningless.
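In case the graph does not come through, here is a rough numeric version of the same comparison (assuming the red line plots something like 2x + 1, per the "2 and the 1" above, against x^2 for the green curve):

```python
# The constants 2 and 1 in 2x + 1 are quickly dwarfed by x^2.
for x in (1, 10, 100, 1_000):
    print(x, 2 * x + 1, x**2)
```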

Mark