views: 527
answers: 6

Hi,

I'm wondering: what is the difference between O(n^2) and O(n log(n))?

Thanks~

+1  A: 

You'll need to be a bit more specific about what you are asking, but in this case O(n log(n)) is faster.

Marek Karbarz
+12  A: 

n^2 grows in complexity more quickly.

[Complexity chart (image)]
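
To put rough numbers on the same comparison, a minimal sketch (it just prints idealised step counts, not the chart's actual data):

    import math

    # Print n, n*log2(n) and n^2 side by side to show how quickly n^2 pulls away.
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n={n:>7}  n*log2(n)={n * math.log2(n):>12.0f}  n^2={n * n:>15}")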

Joe Koberg
Where did you get this image? It's a nice illustration.
Neil Foley
The blue line is y = 0.1 * x^2. I'm trying to figure out what the red line is, specifically what base the logarithm is. Shouldn't it be base 2? The general sense is correct, but the magnitude may be misleading.
Ewan Todd
The log base should be irrelevant in the long run.
Joe Koberg
This doesn't really convey the asymptotic behaviour of O(n^2) and O(n*log(n)), since if the green curve is exactly n^2 then you could have another function which is still O(n^2) yet looks very similar to the red curve. -1 for glossing over this critical aspect of Big-O understanding.
harms
The question asked about the difference between N*log(N) and N^2, not between N^2 and (0.01 * N^2)... which is exactly the kind of difference Big-O notation is SUPPOSED to "gloss over".
Joe Koberg
@Neil: Just made up the graph in Excel!
Joe Koberg
One important behavior is that n < n*log(n) < n^2 for n greater than the log base. Even with crazy coefficients this behavior expresses itself eventually. Here, n*log(n) looks like it has been artificially squashed; otherwise it would be over 100 at n=100.
Ewan Todd
I figured out why: the X axis was not properly labeled... Excel was counting the number of items (rather than using the X value of each item). I will post a new graphic.
Joe Koberg
+1  A: 

http://www.cs.georgetown.edu/~alspaugh/cls/2009b-072/algorithmAnalysis.html

n log(n) grows significantly slower.

rotard
+1  A: 

Big-O gives an upper bound on running time as a function of the size of the data set (n).

An O(n*log(n)) algorithm is not always faster than an O(n^2) algorithm, but when considering the worst case it probably is. An O(n^2) algorithm takes ~4 times longer when you double the working set (worst case); for an O(n*log(n)) algorithm it's less. The bigger your data set, the bigger the advantage of an O(n*log(n)) algorithm usually is.
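
To see that doubling behaviour concretely, here is a minimal sketch that counts idealised "steps" rather than measuring a real algorithm:

    import math

    def steps_quadratic(n):
        return n * n             # idealised O(n^2) cost

    def steps_nlogn(n):
        return n * math.log2(n)  # idealised O(n*log(n)) cost

    # How much does the work grow when the input size doubles?
    for n in (1_000, 1_000_000):
        print(f"n={n}: n^2 work grows x{steps_quadratic(2 * n) / steps_quadratic(n):.2f}, "
              f"n*log(n) work grows x{steps_nlogn(2 * n) / steps_nlogn(n):.2f}")

The n^2 factor is exactly 4; the n*log(n) factor comes out just above 2 and approaches 2 as n grows.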

EDIT: Thanks to 'harms', I'll correct a wrong statement in my first answer: I said that when considering the worst case, O(n^2) would always be slower than O(n*log(n)). That's wrong, since both are only bounds up to a constant factor!

Sample: Say we have the worst case and our data set has size 100.

  • O(n^2) --> 100*100 = 10000
  • O(n*log(n)) --> 100*2 = 200 (using log_10)

The problem is that both can be multiplied by a constant factor; say we multiply the latter one by c. The result will be:

  • O(n^2) --> 100*100 = 10000
  • O(n*log(n)) --> 100*2*c = 200*c (using log_10)

So for c > 50 the O(n*log(n)) algorithm is actually slower than the O(n^2) algorithm at n=100.

I have to update my statement: for every problem, when considering the worst case, an O(n*log(n)) algorithm will be quicker than an O(n^2) algorithm for sufficiently large data sets.

The reason is: the choice of c is arbitrary but constant. If you make the data set large enough, its growth will dominate the effect of any constant choice of c, and when comparing two algorithms the c's of both are constant!
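
A quick sketch of that argument, reusing the log_10 convention from the sample above and picking an arbitrarily large constant (c = 60 here is just an illustrative assumption):

    import math

    C = 60  # deliberately large constant factor in front of n*log10(n)

    def cost_quadratic(n):
        return n * n

    def cost_nlogn(n):
        return C * n * math.log10(n)

    # Find the first input size at which the n*log(n) algorithm wins anyway.
    n = 2
    while cost_nlogn(n) >= cost_quadratic(n):
        n += 1
    print(f"With c={C}, c*n*log10(n) drops below n^2 from n={n} onward.")

However large the constant is chosen, some such crossover point exists.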

Johannes Weiß
"An O(n*log(n)) is not always faster than a O(n^)", this is correct. But the follow-up "but when considering the worst case it is" is unfortunately incorrect. Are you confusing worst-case analysis with Big-O analysis (and maybe best-case with Big-Omega)? Those should not be confused, they have in fact nothing to do with each other.
harms
You are right, since it always holds only up to a constant factor! I will update my answer. Thanks.
Johannes Weiß
A: 

Algorithms that run in O(nlog(n)) time are faster than those that run in O(n^2).

Big-O defines the upper bound on performance. As the size of the data set (n) grows, so does the length of time it takes to perform the task. You might be interested in the iTunes U algorithms course from MIT.

acrosman
-1 for the first statement, +1 for the second. An O(n^2) function is not guaranteed to be slower than an O(nlogn) function for all values of n.
Annabelle
True. I left out that the relationship assumes a sufficiently large value of n.
acrosman
You also left out the fact that O() is an upper bound.
Kugel
Second paragraph opens: "Big-O defines the upper-bound on performance"
acrosman
A: 

"Big Oh" notation gives an estimated upper bound on the growth in the running time of an algorithm. If an algorithm is supposed to be O(n^2), in a naive way, it says that for n=1, it takes a max. time 1 units, for n=2 it takes max. time 4 units and so on. Similarly for O(n log(n)), it says the grown will be such that it obeys the upper bound of O(n log(n)). (If I am more than naive here, please correct me in a comment).

I hope that helps.

Amit