With reference to this answer, what is Theta (a tight bound)?

Omega is the lower bound, which I understand: the minimum time an algorithm may take. And we know Big-O is the upper bound, meaning the maximum time an algorithm may take. But I have no idea about Theta.

+4  A: 

Big O is the upper bound, while Omega is the lower bound. Theta requires both Big O and Omega, so that's why it's referred to as a tight bound (it must be both the upper and lower bound).

For example, an algorithm taking Omega(n log n) takes at least n log n time, but has no upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes AT LEAST n log n (Omega(n log n)) and NO MORE THAN n log n (Big-O(n log n)).
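The "both upper and lower bound" idea can be spot-checked numerically. Below is a minimal sketch of the formal definition: f is Theta(g) when constants c1, c2, and n0 exist with c1*g(n) <= f(n) <= c2*g(n) for all n >= n0. The function names and the sample f(n) = 3n + 10 are hypothetical, chosen only for illustration.

```python
def is_theta_witness(f, g, c1, c2, n0, n_max=10_000):
    """Spot-check that c1, c2 witness f = Theta(g) on the range [n0, n_max)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

f = lambda n: 3 * n + 10   # hypothetical running-time function
g = lambda n: n            # candidate tight bound

# 3n <= 3n + 10 <= 4n holds for every n >= 10, so f is Theta(n).
print(is_theta_witness(f, g, c1=3, c2=4, n0=10))   # True
# No constants can squeeze 3n + 10 between multiples of n^2 from below.
print(is_theta_witness(f, lambda n: n * n, c1=1, c2=4, n0=10))   # False
```

Of course a finite check is not a proof, but it makes the squeeze between c1*g and c2*g concrete.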

Chris Bunch
Oh.. Now the term "tight bound" appears quite self-explanatory to me. Thanks Chris. Stupid me, perhaps I was expecting some complex idea. :)
Adeel Ansari
Yea, there's a lot of fancy notation thrown around but it's not too complex once you get it under your belt.
Chris Bunch
+3  A: 
Charlie Martin
Now reading about the family of Bachmann-Landau notations. Thanks Charlie, I visited that page before, but turned back without reading it to the end.
Adeel Ansari
Hey, it's good to get a refresh on doctoral comps every so often.
Charlie Martin
+1  A: 

The phrases minimum time and maximum time are a bit misleading here. When we talk about Big-O notation, it's not the actual time we are interested in, but how the time grows as the input size gets bigger. And it's usually the average-case or worst-case time we are talking about, not the best case, which is usually not meaningful for solving our problems.
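To make "growth, not absolute time" concrete, here is a small sketch with two hypothetical cost functions: a linear one with a large constant factor and a quadratic one with a tiny constant. The linear algorithm is slower for small inputs, yet its cost eventually falls below the quadratic one, which is why asymptotic notation ignores constants.

```python
# Hypothetical cost functions (the constants are made up for illustration).
slow_linear = lambda n: 1000 * n        # Theta(n), big constant
fast_quadratic = lambda n: 0.1 * n * n  # Theta(n^2), tiny constant

for n in (100, 10_000, 1_000_000):
    winner = "linear" if slow_linear(n) < fast_quadratic(n) else "quadratic"
    print(f"n = {n}: cheaper is {winner}")
```

At n = 100 the quadratic function is cheaper, but by n = 1,000,000 the linear one wins, and it keeps winning for all larger n.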

Using the array search in the accepted answer to the other question as an example: the time it takes to find a particular number in a list of size n is n/2 * some_constant on average. If you treat it as a function f(n) = n/2 * some_constant, it increases no faster than g(n) = n, in the sense given by Charlie. It also increases no slower than g(n). Hence, g(n) is both an upper bound and a lower bound of f(n), so the complexity of linear search is exactly n, meaning that it is Theta(n).
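The n/2 average can be checked directly by counting comparisons. This is a sketch, not the code from the other question; `linear_search_comparisons` is a hypothetical helper that returns how many elements a linear search inspects before finding the target.

```python
def linear_search_comparisons(lst, target):
    """Return the number of comparisons a linear search performs."""
    for i, x in enumerate(lst):
        if x == target:
            return i + 1
    return len(lst)

n = 1000
lst = list(range(n))

# Averaged over every possible target, the cost is (1 + 2 + ... + n) / n
# = (n + 1) / 2, i.e. roughly n/2 -- squeezed between n/4 and n, so Theta(n).
avg = sum(linear_search_comparisons(lst, t) for t in lst) / n
print(avg)   # 500.5
```

The best case (target in the first slot) is 1 comparison, but as the answer notes, that tells us little; the average grows linearly with n.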

In this regard, the explanation in the accepted answer to the other question is not entirely correct: it claims that O(n) is an upper bound because the algorithm can run in constant time for some inputs (this is the best case I mentioned above, which is not really what we want to know about the running time).

PolyThinker
So, can we say that Ω is the best case and O is the worst? And should we replace the terms with best case and worst case, respectively?
Adeel Ansari
Best case is O(1) for any problem?
Zach Langley
@Adeel, no, Theta and O can both refer to either average case or worst case. @Zach, well, not exactly. Thanks for pointing that out.
PolyThinker
+8  A: 
Bill the Lizard