I am trying to find a good explanation that will help me quickly understand Big O and Big Theta notation. I always feel an explanation can be given in a million different ways, and I guess I'm looking for the one explanation that finally makes sense. I know this is a n00b question, but any help would be appreciated...

+4  A: 

As far as explanations go, StackOverflow folks seem to get a kick out of my telephone book example. You might also like Big O For Eight Year Olds.

John Feminella
+5  A: 

One point of confusion is why an algorithm that performs two operations for each item in a list, which you might think is O(2n), and one that performs a single operation per item, which is O(n), are actually both considered O(n).
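To make that concrete, here is a minimal sketch (Python here; the function names are just illustrative). Both loops grow linearly with the input, so both are O(n), even though the second does twice the work per item:

    def one_pass(items):
        # One operation per item: n operations total, O(n).
        total = 0
        for x in items:
            total += x
        return total

    def two_operations_per_item(items):
        # Two operations per item: 2n operations total,
        # which is "O(2n)" -- still classified as O(n).
        total = 0
        count = 0
        for x in items:
            total += x   # operation 1
            count += 1   # operation 2
        return total, count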

The reason for this is that when you start working with huge data sets, the difference between O(2n) and O(n) is insignificant compared to the gap between either of them and O(n^2) or O(log n). Suppose you had three different algorithms searching a dataset that contains a million items. The number of operations each algorithm would take:

O(2n) = 2,000,000

O(n) = 1,000,000

O(n^2) = 1,000,000,000,000

The O(2n) algorithm is only twice as slow as O(n), but O(n^2) is a million times as slow. That's an incredibly massive difference! So Big O notation is really about how an algorithm "scales", or in other words, how its performance changes as the data set grows larger and larger.
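To watch that scaling happen, here's a rough sketch (again assumed Python, counting loop iterations rather than timing anything) that compares a single pass to a nested pair of loops:

    def linear_ops(items):
        # One pass over the data: n iterations.
        ops = 0
        for _ in items:
            ops += 1
        return ops

    def quadratic_ops(items):
        # A loop nested inside a loop: n * n iterations.
        ops = 0
        for _ in items:
            for _ in items:
                ops += 1
        return ops

    data = list(range(1000))    # kept small so the quadratic loop finishes quickly
    print(linear_ops(data))     # 1000
    print(quadratic_ops(data))  # 1000000

Double the size of data and linear_ops doubles while quadratic_ops quadruples; that growth pattern, not the raw counts, is what Big O describes.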

An O(n^2) algorithm will perform well on very tiny data sets, but its performance degrades rapidly as the data grows. O(2n) and O(n) degrade evenly and gradually, which is closer to what a user would expect when working with more data.

For this reason, people don't talk about O(2n); they just say O(n), since both represent linear time (linear in that the number of operations increases evenly and gradually as you add data).
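Dropping the constant follows directly from the formal definitions (stated here for reference; the answer above doesn't spell them out):

    f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \text{ such that } f(n) \le c \cdot g(n) \text{ for all } n \ge n_0

    f(n) \in \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 \text{ such that } c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0

Taking c = 2 gives 2n <= 2n for all n >= 1, so 2n is O(n); and since 1 * n <= 2n as well, 2n is also Theta(n), which answers the Big Theta half of the question.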

It can be frustrating at first to accept that an algorithm that runs twice as slowly as another is still considered O(n), but Big O notation is not a measure of relative speed. It is a measure of how an algorithm scales with respect to the amount of data it is processing.

AaronLS