
So when someone asks you to give an O(n) or an O(n log n) algorithm to compute something, how do you know what to answer? It seems the only way to answer this sort of question is to know the time complexities of various algorithms beforehand, rather than thinking something up on the spot. Am I correct in assuming this?

+9  A: 

Simple cases are:

Iteration over a list - O(n)

Single binary search or balanced-tree operation - O(log n) (which means insertion of n elements into the tree is O(n log n))

Hashtable operation - O(1)

NP-complete problem (here you need some experience to develop an intuition) - typically exponential, e.g. O(2^n) or O(n!)

For a graph with E edges and V vertices the complexity is F(E, V), depending on the nature of the algorithm and the density of the graph; O(E + V) for a good DFS/BFS.

Almost every algorithm can be partitioned into a set of the above (or similar) small blocks. When blocks are combined recursively, like A-includes-B, the complexities multiply; when blocks follow each other, like A-then-B, the complexity is max(complexity of A, complexity of B).
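To make the combination rules concrete, here is a small Python sketch (my own illustration, not part of the original answer): a loop containing a binary search multiplies to O(n log n), while a sort followed by a linear scan is dominated by the sort.

```python
import bisect

def nested_example(items):
    """A-includes-B: an O(n) loop containing an O(log n) lookup -> O(n log n)."""
    sorted_items = sorted(items)                     # O(n log n) preprocessing
    count = 0
    for x in items:                                  # O(n) outer block ...
        i = bisect.bisect_left(sorted_items, x)      # ... times O(log n) inner block
        if i < len(sorted_items) and sorted_items[i] == x:
            count += 1
    return count

def sequential_example(items):
    """A-then-B: an O(n log n) sort followed by an O(n) scan -> max = O(n log n)."""
    s = sorted(items)                                # O(n log n)
    return sum(1 for a, b in zip(s, s[1:]) if a == b)  # O(n) count of duplicate neighbors
```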

bobah
Another really common situation: `O(n^2)` often means you will be doing some operation on every pair of elements.
Larry Wang
There's also the case when sorting is needed as a preprocessing step, adding a complexity of O(n log n).
kunigami
@kunigami - I wouldn't include sorting in the list of simple cases. It is not necessarily O(n log n), and if sorting is required as a pre-processing step it doesn't "add" a complexity; the complexity of a group of serial blocks is determined by the most complex element in the serial chain.
bobah
@bobah - By "add" I meant "sum". If your algorithm is O(n^2), with the "sorting block" it is O(n log n + n^2) = O(n^2).
kunigami
+2  A: 

You are right: you need to know the time complexities of different algorithms to answer this. You should know the time complexities for sorting, for finding items in a dictionary and in a hash table, and for union-find, flow graphs, DFS, BFS, minimum spanning trees, etc. These are the basics.

Introduction to Algorithms should have you well covered.

Amir Rachum
+1  A: 
  1. Think very hard.
  2. Come up with an algorithm for the problem.
  3. Analyze it to determine its time complexity (big-O).
  4. If the time complexity is what you were asked to produce, you are done.
  5. Else goto 1.

Seriously, though, you need to know the complexity of algorithms for common problems, such as iteration, searching, sorting, hash table look-up, etc. For example, it is very helpful to know that a simple sorting algorithm like Bubble Sort is O(n^2), and that Quick Sort is O(n log n) in the average case. It is also important to be able to quickly analyze an algorithm to determine its complexity, and to see how it can be changed to make it faster.
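As a concrete illustration of the Bubble Sort claim above (a sketch of my own, with a comparison counter added purely for illustration), the nested loops perform n*(n-1)/2 comparisons, i.e. O(n^2):

```python
def bubble_sort(a):
    """Sort a copy of `a`, counting comparisons to expose the O(n^2) behavior."""
    a = list(a)
    comparisons = 0
    for i in range(len(a)):                  # O(n) passes ...
        for j in range(len(a) - 1 - i):      # ... each doing O(n) comparisons
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

result, c = bubble_sort([5, 1, 4, 2, 8])
# For n = 5 elements, c is 5*4/2 = 10 comparisons.
```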

Dima
+1  A: 

That's not too far from the truth. There are a couple of systematic methods, but they are hard work, and even then they tend to get "shortcut" at some point.

Big-O gives an upper bound. If someone asks you for any algorithm and you don't know, you can say O(n^n) and probably be correct (though there are even slower algorithms out there). It's not a tight bound, of course.

The "shortcut" is basically the point of inspiration when you spot a pattern that proves some particular upper bound.

99% of the time, most people just use their intuition to find a good way to look at the algorithm, and then do just enough to prove that bound. For example, instead of looking at the actual flow of execution, it is common to say "each item is processed at most x times, each time taking constant time" (for an O(n) algorithm). You may have missed the fact that e.g. at most log n items are ever processed, but if you really grok what the algorithm is doing, that's reasonably unlikely.
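The "each item is processed at most x times" argument can be seen in a classic example (an illustration of my own, not from the answer): the next-greater-element algorithm. Its inner while loop looks nested, but every index is pushed and popped at most once, so the whole thing is O(n), not O(n^2).

```python
def next_greater(nums):
    """For each element, find the next element to its right that is greater, else -1."""
    result = [-1] * len(nums)
    stack = []  # indices of elements still awaiting a greater value
    for i, x in enumerate(nums):
        while stack and nums[stack[-1]] < x:  # each index is popped at most once
            result[stack.pop()] = x
        stack.append(i)                       # each index is pushed exactly once
    return result
```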

Of course this probably won't get you through an algorithms course.

For the systematic methods - well, there's the "MIT 6.046J / 18.410J Introduction to Algorithms" course videos which can be viewed on YouTube. The lecturer is one of the authors of a very well respected algorithms textbook.

Steve314
On reflection, I think I answered slightly the wrong question. Oh well.
Steve314
A: 

This may be more academic than what you're looking for, but it's worth knowing that for certain problems, it can be proven that any algorithm that solves them won't do better than some lower bound. For instance, a comparison sort can't do better than Ω(n log n) comparisons. Another example is the Towers of Hanoi, any solution of which has to be exponential because of the way the problem is constructed.
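The Towers of Hanoi claim can be sketched directly (an illustrative Python version of my own): the recurrence T(n) = 2*T(n-1) + 1 solves to 2^n - 1 moves, which is exponential no matter how clever the implementation.

```python
def hanoi_moves(n, src="A", dst="C", via="B", moves=None):
    """Return the list of (from, to) moves solving Towers of Hanoi for n disks."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi_moves(n - 1, src, via, dst, moves)  # move n-1 disks out of the way
        moves.append((src, dst))                  # move the largest disk
        hanoi_moves(n - 1, via, dst, src, moves)  # move the n-1 disks back on top
    return moves

moves = hanoi_moves(3)  # 2**3 - 1 == 7 moves
```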

There is some discussion of lower bounds on MathOverflow... I linked it for completeness; I don't know how practical reading it would be :D

Anyway, sometimes the right question to ask is, can it be done in the given time?

Beyond that, like everyone else said: have a working knowledge of which algorithms are used to solve which problems. Bobah has good examples. (Just note that hashtable operations aren't always guaranteed to be O(1), but expected to be.)

John C
A: 

The way I do it is a bit more pragmatic.

  1. Get an algorithm - either you already know one, or you find one online, or you create it yourself.
  2. Emulate the algorithm in your mind.
  3. Increase the size of your set (n) by a gazillion.
  4. Emulate it in your mind again. Would the time increase by a gazillion? Then it's O(n). Did it increase by gazillion * log(gazillion)? Then it's O(n log n).
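The steps above can be turned into a small, deterministic experiment (a hypothetical Python sketch of my own, using operation counts instead of mental emulation): double n and look at how the count grows.

```python
def count_ops_linear(n):
    """One pass over the input: O(n) operations."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_ops_quadratic(n):
    """All pairs of the input: O(n^2) operations."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

# Doubling n doubles the linear count but quadruples the quadratic one:
linear_ratio = count_ops_linear(200) / count_ops_linear(100)        # 2.0
quadratic_ratio = count_ops_quadratic(200) / count_ops_quadratic(100)  # 4.0
```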

The only other way I know of is to actually find the theoretical complexity of your problem and perform an asymptotic analysis.

krico