views: 271
answers: 5

Given the following code, what is the complexity of 3. and how would I represent simple algorithms with the following complexities?

O(n²+n)
O(n²+2n)
O(log n)
O(n log n)

var c1 = new[] {1,2,3};
var c2 = new[] {1,2,3};

//1.
// O(n)
foreach(var i in c1)
{
}

//2.
// O(n²)
foreach(var i in c1)
{
  foreach(var j in c1)
  {
  }
}

//3.
// O(nⁿ⁺ᵒ)?
foreach(var i in c1)
{
  foreach(var j in c2)
  {
  }
}
+6  A: 

3 is O(n*m), or O(n^2) if the two collections are the same size.

O(n^2+n) is pointless because the n term grows more slowly than n^2 and is eventually dominated by it. Just write O(n^2).

Most decent comparison sort algorithms run in O(n*log(n)) time. If you don't know any, look them up on Wikipedia.

A binary search is O(log(n)).
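
To make that concrete, here is a minimal C# binary search sketch (illustrative only, not from the original answer; BinarySearch is just an example name, and the array is assumed to already be sorted):

// O(log(n)) binary search: halve the search range on every pass.
// Assumes `sorted` is in ascending order; returns the index of `target` or -1.
static int BinarySearch(int[] sorted, int target)
{
    int lo = 0, hi = sorted.Length - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;        // midpoint, written to avoid overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;                               // not found
}

Sorting first with the built-in Array.Sort (an O(n*log(n)) comparison sort) and then calling this repeatedly is the usual pattern.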

Mark Byers
Thanks. Given O(n*m) = O(n^2) for sets of the same size, if one set is half the size of the other what is the resulting complexity?
Ben Aston
To clarify, I would say that the algorithm is order n-squared if the two collections are expected to be proportional in size. Otherwise O(n*m) is more appropriate.
@Ben Aston: You can ignore the coefficients in O(n) notation. If m = n/2 then O(n*m) = O(n^2/2) = O(n^2).
Mark Byers
John, you are wrong. If the sizes are n and n/2, the complexity is O(n^2).
shura
+3  A: 

The outer foreach is executed n = |c1| times (where |x| denotes the size of x), while the inner foreach is executed m = |c2| times for each outer iteration. That's O(n * m) iterations in total.


how would I represent simple algorithms with the following complexities?

  • O(n²+n)

This is the same as O(n^2). Something that takes O(n^2) time would be drinking a toast with every other person at a party, assuming that there are always exactly two people in a toast, and only one person does the toasting at a time. (A code sketch of this and the O(n log n) case follows after this list.)

  • O(n²+2n)

Same as above; the O(n^2) term dominates. Another example of an O(n^2) effort is planting trees in a square garden of length n, assuming it takes constant time to plant each tree, and that once you plant a tree other trees are excluded from its vicinity.

  • O(log n)

An example of this would be finding a word in a dictionary by repeatedly picking the midpoint of the region of pages you need to search next. (In other words, a binary search.)

  • O(n log n)

Use the above algorithm, but now you have to find every word in the dictionary.
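
A rough C# sketch of the toast and dictionary examples above (illustrative only, not from the original answer; guests, sortedDictionary, and words are made-up names, and Array.BinarySearch is the built-in O(log n) lookup):

// O(n²): every guest toasts every other guest, one pair at a time.
static int CountToasts(string[] guests)
{
    int toasts = 0;
    for (int i = 0; i < guests.Length; i++)
        for (int j = i + 1; j < guests.Length; j++)
            toasts++;                        // one toast per distinct pair
    return toasts;                           // n*(n-1)/2 pairs, i.e. O(n²)
}

// O(n log n): look up each of the n words with an O(log n) binary search.
static int CountWordsFound(string[] sortedDictionary, string[] words)
{
    int found = 0;
    foreach (var word in words)              // n words...
        if (System.Array.BinarySearch(sortedDictionary, word) >= 0)
            found++;                         // ...each located in O(log n)
    return found;
}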

John Feminella
+1  A: 

Complexity of 3 is O(m*n). There is no point writing O(n²+n) or O(n²+2n); it is just O(n²). This is because n is o(n²).

Example of O(log(n)) is binary search.

Example of O(n*log(n)) is merge sort.
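
For reference, a compact merge sort sketch in C# (illustrative only, not part of the answer; uses C# 8 range syntax):

// O(n*log(n)) merge sort: log(n) levels of splitting, O(n) work to merge each level.
static int[] MergeSort(int[] a)
{
    if (a.Length <= 1) return a;             // base case: already sorted
    int mid = a.Length / 2;
    int[] left  = MergeSort(a[..mid]);       // sort first half
    int[] right = MergeSort(a[mid..]);       // sort second half
    return Merge(left, right);               // combine in linear time
}

static int[] Merge(int[] left, int[] right)
{
    var result = new int[left.Length + right.Length];
    int i = 0, j = 0, k = 0;
    while (i < left.Length && j < right.Length)
        result[k++] = left[i] <= right[j] ? left[i++] : right[j++];
    while (i < left.Length)  result[k++] = left[i++];
    while (j < right.Length) result[k++] = right[j++];
    return result;
}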

shura
Also, since Big-O describes an asymptotic upper bound, something that is O(n^2+n) isn't any different from O(n^2) in the long run, because the n^2 term outweighs the n.
Brian T Hannan
+2  A: 

There is no point in writing O(n^2 + n) or O(n^2 + 2n). Leaving aside most of the mathematical foundations of algorithmic complexity, you at least need to know that it is "asymptotic": as n approaches infinity, the value of n^2 + n is dominated by the n^2 term, so that is the asymptotic complexity of n^2 + n.
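
One way to see that concretely (an illustrative worked bound, not from the original answer):

n^2 \le n^2 + n \le 2n^2 \quad \text{for all } n \ge 1,
\qquad\text{so}\quad n^2 + n = \Theta(n^2) \ \text{and}\ O(n^2 + n) = O(n^2).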

3's complexity is O(I * J), where I and J are the sizes of the inputs c1 and c2.

David Gladfelter
+2  A: 

Truth be told, O(n²+n) and O(n²+2n) are the same thing: both reduce to O(n²).

brian