views:

571

answers:

8

Possible Duplicate:
What is Big O notation? Do you use it?

Hi all,

Fairly basic scalability notation question.

I recently received a comment on a post about my Python ordered-list implementation: "but beware that your 'ordered set' implementation is O(N) for insertions".

That's great to know, but I'm not sure what it means.

I've seen notation such as n(o), o(N), N(o-1) or N(o*o).

What does the above notation refer to?

+10  A: 

It's called Big O Notation: http://en.wikipedia.org/wiki/Big_O_notation

So saying that insertion is O(n) means that you have to walk through the whole list (or half of it -- big O notation ignores constant factors) to perform the insertion.
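
For a concrete picture, here is a minimal sketch of an O(n) insertion into a plain Python list kept in sorted order (an assumed approach, not necessarily what your implementation does): `bisect` finds the position in O(log n) comparisons, but shifting the elements after that position is the O(n) part.

```python
import bisect

def ordered_insert(items, value):
    """Insert value into the sorted list `items`, keeping it sorted.

    bisect_left finds the position in O(log n) comparisons, but
    list.insert must shift every element after that position one
    slot to the right, so the insertion as a whole is O(n).
    """
    position = bisect.bisect_left(items, value)
    items.insert(position, value)

items = [1, 3, 5, 9]
ordered_insert(items, 4)
print(items)  # [1, 3, 4, 5, 9]
```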

This looks like a nice introduction: http://rob-bell.net/2009/06/a-beginners-guide-to-big-o-notation/

dreeves
Cool, thanks, this is a good start.
Fire Crow
*O* is the upper bound, meaning the complexity of an insertion grows at most as fast as *n*.
Gumbo
+2  A: 

This may help:

http://en.wikipedia.org/wiki/Big_O_notation#Orders_of_common_functions

O(n): Finding an item in an unsorted list or an unbalanced tree (worst case); adding two n-digit numbers
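
To illustrate the second example, here is a rough sketch (using my own toy digit-list representation, nothing standard) of schoolbook addition of two n-digit numbers: each digit is visited once, so the work grows linearly with n.

```python
def add_digit_lists(a, b):
    """Add two numbers stored as lists of digits, least significant first.

    Each digit is visited exactly once, so the running time grows
    linearly with the number of digits: O(n).
    """
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        digit_sum = carry
        if i < len(a):
            digit_sum += a[i]
        if i < len(b):
            digit_sum += b[i]
        result.append(digit_sum % 10)
        carry = digit_sum // 10
    if carry:
        result.append(carry)
    return result

# 457 + 68 = 525 (digits stored least significant first)
print(add_digit_lists([7, 5, 4], [8, 6]))  # [5, 2, 5]
```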

Good luck!

Ian P
+2  A: 

Wikipedia explains it far better than I can; however, it means that if your list size is N, it takes at max N loops/iterations to insert an item. (In effect, you have to iterate over the whole list.)

If you want a better understanding, there is a free book from Berkeley that goes into more depth on the notation.

Yacoby
awesome, that book is great
Fire Crow
"it takes at max N loops/iterations to insert an item"hmm.. N loops on problem size N = O(N * N) = O(N^2), unless I misunderstand what you were trying to say.
BioBuckyBall
Not exactly "at **max** N". It can be 2*N, or 3*N. That would still be O(N).
Martinho Fernandes
+6  A: 

Specifically, O(n) means that if there are 2x as many items in the list, it'll take **no more than** twice as long, and if there are 50 times as many, it'll take **no more than** 50 times as long. See the Wikipedia article dreeves pointed out for more details.

Edit (in bold above): It was pointed out that Big-O does represent the upper bound, so if there are twice as many elements in the list, insertion will take at most twice as long, and if there are 50 times as many elements, it would take at most 50 times as long.

If it were additionally Ω(n) (Big Omega of n), then it would take at least twice as long for a list that is twice as big. If your implementation is both O(n) and Ω(n), meaning that it'll take both at least and at most twice as long for a list twice as big, then it can be said to be Θ(n) (Big Theta of n), meaning that it'll take roughly twice as long if there are twice as many elements.

According to Wikipedia (and personal experience, being guilty of it myself), Big-O is often used where Big-Theta is what is meant. It would be technically correct to call your function O(n^n^n^n), because all Big-O says is that your function is no slower than that, but no one would actually say that other than to prove a point, because it's not very useful and is misleading information, despite being technically accurate.
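
To make the bound distinction concrete, here is a hedged sketch that counts the comparisons made by a full linear scan with no early exit: the count is always exactly n, so the scan is both O(n) and Ω(n), and therefore Θ(n).

```python
def count_scan_steps(items, target):
    """Count comparisons made by a full linear scan with no early exit.

    The loop always runs len(items) times, so the step count is exactly
    n: at most n (O(n)) and at least n (Omega(n)), hence Theta(n).
    """
    steps = 0
    found = False
    for item in items:
        steps += 1
        if item == target:
            found = True
    return found, steps

for size in (10, 20, 40):
    _, steps = count_scan_steps(list(range(size)), -1)
    print(size, steps)  # doubling the size doubles the step count
```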

Davy8
This isn't the case. What you described is the lower bound, not the upper bound.
San Jacinto
Thanks, corrected it. (I think... it seems like the constant value cancels out in the specific case of being on the order of n, because the constant should be there when n=1. So if the time for n=1 is, say, 50ms, then the constant value is 50, and for n=10 it should take 50*10ms, which is still 10 times as long; the constant cancels out and can be safely discarded.)
Davy8
I think this is probably the best answer to this question, now.
San Jacinto
+2  A: 

Short answer: It means that the processing time is in linear relation to the size of the input. E.g. if the size of the input (the length of the list) triples, the processing time (roughly) triples, and if it increases a thousandfold, the processing time increases by the same magnitude.

Long answer: See the links provided by Ian P and dreeves.
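
One informal way to see the linear relation yourself is to time a single pass over lists of different sizes (a rough sketch with timeit; absolute numbers vary by machine, but the ratio of the timings should track the ratio of the input sizes):

```python
import timeit

def linear_scan(items):
    """One pass over the list: O(n) work."""
    total = 0
    for item in items:
        total += item
    return total

for n in (100_000, 300_000):
    data = list(range(n))
    seconds = timeit.timeit(lambda d=data: linear_scan(d), number=100)
    print(n, round(seconds, 4))
# Tripling the list size should make each timing roughly triple as well.
```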

Kimvais
A: 

It refers to how complex your program is, i.e., how many operations it takes to actually solve a problem. O(n) means that an operation takes about as many steps as there are items in your list, which for insertion is relatively slow. Likewise, O(n^2) means that an operation takes "n" squared steps to accomplish, and so on... The "O" is for "Order", and the expression in the parentheses is always related to the number of items being manipulated in the procedure.

Michael
+1 for what the "O" stands for
Fire Crow
+5  A: 

The comment was referring to the Big-O Notation.

Briefly:

  1. O(1) means constant time, independent of the number of items.
  2. O(N) means a time proportional to the number of items.
  3. O(log N) means a time proportional to log(N).

Basically any 'O' notation means an operation will take time up to a maximum of k*f(N) for sufficiently large N,
where:

k is a constant multiplier

f() is a function that depends on N
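
As a rough Python illustration of those three cases (a sketch; the exact costs of the built-ins are CPython implementation details, but these are the usual characterizations):

```python
import bisect

data = list(range(1_000_000))              # sorted list
lookup = {value: True for value in data}   # hash table

# O(1): a dict lookup takes about the same time regardless of size
# (average case for hash tables).
print(500_000 in lookup)

# O(N): membership testing on a list walks it item by item.
print(500_000 in data)

# O(log N): binary search on a sorted list halves the range each step.
print(bisect.bisect_left(data, 500_000))
```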

quamrana
thanks, that fills in the gaps :)
Fire Crow
How does it fill gaps? The definition of big-oh is wrong (even informally), and everybody else addressed items similar to points 1, 2, and 3. Glad you found an answer you like, but it baffles me.
San Jacinto
+2  A: 

O(n) is Big O Notation and refers to the complexity of a given algorithm. n refers to the size of the input; in your case it's the number of items in your list.

O(n) means that your algorithm will take on the order of n operations to insert an item, e.g. looping through the list once (or a constant number of times, such as twice, or only looping through half of it).

O(1) means it takes constant time, i.e. it is not dependent on how many items are in the list.

O(n^2) means that every insert takes n*n operations: 1 operation for 1 item, 4 operations for 2 items, 9 operations for 3 items. As you can see, O(n^2) algorithms become inefficient for handling large numbers of items.

For lists, O(n) is not bad for insertion, but it's not the quickest. Also note that O(n/2) is considered the same as O(n) because they both grow at the same rate with n.
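
To tie those cases to concrete list operations, a short hedged sketch (the costs noted are the usual ones for CPython lists, not something guaranteed by the language spec):

```python
items = list(range(10_000))

# O(1): appending at the end doesn't depend on the list's length
# (amortized constant time for CPython lists).
items.append(10_000)

# O(n): inserting at the front shifts every existing element one slot.
items.insert(0, -1)

# O(n^2) overall: inserting at the front n times does 1 + 2 + ... + n shifts,
# which is why quadratic approaches get slow for large inputs.
front_loaded = []
for i in range(1_000):
    front_loaded.insert(0, i)
print(len(items), len(front_loaded))
```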

Colin Gislason
"n operations to insert an item. e.g. looping through the list once." If you loop twice, it's still O(n) and you perform 2*n operations. What you said about O(n/2) is true for O(2*n) as well.
Martinho Fernandes
Good point. I've updated my answer a little.
Colin Gislason