I am learning algorithm analysis and having trouble understanding the difference between O, Ω, and Θ.
The way they're defined is as follows:
- `f(n) = O(g(n))` means `c · g(n)` is an upper bound on `f(n)`. Thus there exists some constant `c` such that `f(n)` is always ≤ `c · g(n)`, for large enough `n` (i.e., `n ≥ n₀` for some constant `n₀`).
- `f(n) = Ω(g(n))` means `c · g(n)` is a lower bound on `f(n)`. Thus there exists some constant `c` such that `f(n)` is always ≥ `c · g(n)`, for all `n ≥ n₀`.
- `f(n) = Θ(g(n))` means `c₁ · g(n)` is an upper bound on `f(n)` and `c₂ · g(n)` is a lower bound on `f(n)`, for all `n ≥ n₀`. Thus there exist constants `c₁` and `c₂` such that `f(n) ≤ c₁ · g(n)` and `f(n) ≥ c₂ · g(n)`. This means that `g(n)` provides a nice, tight bound on `f(n)`.
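To check that I am reading the Θ definition correctly, here is a small example I worked out myself (the function and constants are my own choice, so they may be wrong):

```latex
% My own worked example for the Theta definition:
% claim: f(n) = 3n^2 + 2n is Theta(n^2).
\[
\underbrace{3n^2}_{c_2 \cdot g(n)} \;\le\; 3n^2 + 2n \;\le\; \underbrace{5n^2}_{c_1 \cdot g(n)}
\quad \text{for all } n \ge n_0 = 1,
\]
\[
\text{so } f(n) = \Theta(n^2) \text{ with } c_1 = 5,\; c_2 = 3,\; n_0 = 1.
\]
```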
The way I have understood this is:
- `O(f(n))` gives the worst-case complexity of a given function/algorithm.
- `Ω(f(n))` gives the best-case complexity of a given function/algorithm.
- `Θ(f(n))` gives the average-case complexity of a given function/algorithm.
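For example, under my reading above, for quicksort I would write something like the following (this is my own interpretation, so it may be exactly where I am going wrong):

```latex
% My (possibly wrong) reading, illustrated with quicksort:
\[
\text{worst case: } O(n^2), \qquad
\text{best case: } \Omega(n \log n), \qquad
\text{average case: } \Theta(n \log n)
\]
```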
Please correct me if I am wrong. If that is the case, then the time complexity of every algorithm should be expressible in all three notations. But I have observed that it is usually expressed with just one of O, Ω, or Θ; why not all three?