I heard the following quote once, but forgot to whom it is attributed:

While waiting for a (polynomial-time) algorithm to stop, don't forget that your lifetime is bounded by a polynomial, too.

Could somebody supply a reference?

Another question I have is: polynomial-time algorithms are great, but what is an example of an algorithm used in practice whose running time is O(n^101), i.e. bounded by a polynomial of high degree?

+2  A: 

Well, I haven't seen O(n^101), but it is common to inadvertently create high-order algorithms by combining other, slightly lower-order algorithms. In my experience the complexity is rarely limited to one variable; it is more often stated in terms of multiple variables, e.g. O(A * log(B) * log(C) * D^2 * (E-F)).
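A minimal sketch of how this happens (all function names here are invented for illustration): each helper looks cheap in isolation, but composing them multiplies the costs.

```python
# Sketch: three innocent-looking helpers whose costs multiply when combined.

def contains(seq, x):          # O(n) linear scan
    return any(e == x for e in seq)

def intersect(a, b):           # O(n*m): runs the O(m) scan for every element of a
    return [x for x in a if contains(b, x)]

def common_to_all(lists):      # O(k*n*m): repeats the O(n*m) step for k lists
    result = lists[0]
    for other in lists[1:]:
        result = intersect(result, other)
    return result

print(common_to_all([[1, 2, 3, 4], [2, 3, 4], [3, 4, 5]]))  # [3, 4]
```

No single function looks expensive, which is exactly why the combined order goes unnoticed until profiling.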

As an example, I was recently tasked to develop an algorithm to locate all potential sites for a pumped hydroelectric power station in a given terrain model with an area of (X) by (Y) meters made up of (N) 3D coordinates. The requirement was to find a flat area of a specified minimum area (A) within a specified maximum horizontal distance (H) and minimum height difference (Z) of another flat area of specified minimum size. Flatness in this context is defined as the volume of material that would have to be moved to level the area to an arbitrary elevation (E), searched at vertical interval (V). Areas were defined as polygons of (S) sides with diameter (D), and the search resolution was specified in meters (M). The brute-force complexity goes very roughly like this:

1) Linearly scan for an initial flat area: O((X/M) * (Y/M)); this area will have a height range of Z1 to Z2.

2) Compute flatness at the current position: computing a single volume is O(S), and searching for the height with minimum volume is O(((Z2-Z1)/V) * S).

3) Radially scan for another flat area close to the current flat area, O(D/M), and compute the flatness of the new area, O(((Z3-Z4)/V) * S).

Combining these we get O((X/M) * (Y/M) * ((Z2-Z1)/V) * S * (D/M) * (H/M) * ((Z3-Z4)/V) * S), and an obvious need for a better approach ;)

The complexity arises in this case because of the necessity to search within searches. e.g. searching for flat spots in the terrain, where the definition of flat spots itself requires a non-trivial search, which in turn may lead to more searching. Sometimes this is unavoidable, leading to undesirable levels of complexity.
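That search-within-a-search structure can be sketched schematically. Everything below is an invented toy stand-in (a tiny grid, a one-cell "area", fixed elevation bounds), not the actual survey code; it only shows how the loops multiply.

```python
# Schematic sketch of the nested brute-force search described above.
# Toy model: the terrain is a grid of heights, an "area" is one cell,
# and candidate elevations run over a small fixed range.

def flatness_cost(grid, x, y, elevation):
    """Volume of material to level a one-cell 'area' to `elevation` (O(1) toy)."""
    return abs(grid[y][x] - elevation)

def best_flatness(grid, x, y, z_lo, z_hi, v_step):
    """Inner search over candidate elevations: O((z_hi - z_lo) / v_step)."""
    return min(flatness_cost(grid, x, y, e)
               for e in range(z_lo, z_hi + 1, v_step))

def find_site_pairs(grid, v_step=1, max_cost=0):
    """Outer scan over every cell (O(X*Y)); for each flat cell, another
    scan over every other cell, each repeating the elevation search:
    roughly O((X*Y)^2 * Z/V) overall."""
    h, w = len(grid), len(grid[0])
    pairs = []
    for y1 in range(h):
        for x1 in range(w):
            if best_flatness(grid, x1, y1, 0, 3, v_step) > max_cost:
                continue
            for y2 in range(h):
                for x2 in range(w):
                    if (x1, y1) != (x2, y2) and \
                       best_flatness(grid, x2, y2, 0, 3, v_step) <= max_cost:
                        pairs.append(((x1, y1), (x2, y2)))
    return pairs
```

Each loop level looks reasonable on its own; the blow-up only appears when you write the product out, exactly as in the combined formula above.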

Abstraction can often be your enemy here: one iterative library routine iteratively uses another iterative library routine, which in turn is used iteratively in your own code. Bad choices of container classes are a regular pitfall, particularly when you hit containers containing other containers.
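A minimal sketch of that container pitfall (function names invented for illustration): the membership test hides a linear scan, so each layer of looping multiplies it.

```python
# Sketch: 'x not in seen' on a Python list is itself an O(n) scan, so the
# innocent one-liner below is O(n^2) per call, and calling it from a loop
# makes O(n^3). Swapping the list for a set drops the inner scan to O(1).

def new_items(seen, batch):
    # hidden inner loop: list membership is O(len(seen))
    return [x for x in batch if x not in seen]

def merge_batches(batches):
    seen = []
    for batch in batches:                    # outer loop
        seen.extend(new_items(seen, batch))  # hidden O(n^2) inside
    return seen

print(merge_batches([[1, 2], [2, 3], [3, 4]]))  # [1, 2, 3, 4]
```

The abstraction is what hides the cost: at the call site, `new_items` reads like a single operation.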

Shane MacLaughlin
+2  A: 

Most likely any algorithm of complexity O(n^100) is not practical at all, which explains why such algorithms aren't used in practice.

One recurring family of high-degree polynomial problems is the one where you have a large collection of N objects and need to find an "optimal" subset of k elements according to some arbitrary metric, or a subset of k elements with a certain property. The only universally applicable solution is to enumerate all possible subsets, which immediately gives O(N^k) complexity (for fixed k). However, a case with k = 100 is difficult to imagine and could not be solved in practice for most values of N.
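A hedged sketch of that brute-force enumeration (the scoring metric below is invented purely for illustration):

```python
from itertools import combinations

def best_subset(objects, k, score):
    """Enumerate all C(N, k) subsets -- O(N^k) for fixed k -- and keep
    the one maximizing `score`. Fine for tiny k, hopeless as k grows."""
    return max(combinations(objects, k), key=score)

# Toy metric (invented): maximize the sum of the chosen elements.
print(best_subset([3, 1, 4, 1, 5], 2, score=sum))  # (4, 5)
```

For k = 2 on five elements this is ten subsets; for k = 100 the count C(N, 100) is astronomical for any interesting N, which is the point of the answer above.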

antti.huima