Dynamic programming is, almost by definition, the problem of finding a shortest/longest path in an implicit DAG. Every DP algorithm essentially does this.
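For example (just a toy sketch of mine, nothing canonical): edit distance can be computed by running Dijkstra on the implicit DAG of states (i, j), generating the edges on the fly rather than ever building the graph explicitly.

    # Toy sketch: edit distance as a shortest path in an implicit DAG of
    # states (i, j); edges are generated lazily from the two strings.
    import heapq

    def edit_distance(a: str, b: str) -> int:
        start, goal = (0, 0), (len(a), len(b))
        dist = {start: 0}
        heap = [(0, start)]
        while heap:
            d, (i, j) = heapq.heappop(heap)
            if (i, j) == goal:
                return d
            if d > dist[(i, j)]:
                continue  # stale heap entry
            moves = []  # implicit outgoing edges
            if i < len(a):
                moves.append(((i + 1, j), 1))  # delete a[i]
            if j < len(b):
                moves.append(((i, j + 1), 1))  # insert b[j]
            if i < len(a) and j < len(b):
                moves.append(((i + 1, j + 1), 0 if a[i] == b[j] else 1))  # match/substitute
            for state, cost in moves:
                nd = d + cost
                if nd < dist.get(state, float("inf")):
                    dist[state] = nd
                    heapq.heappush(heap, (nd, state))
        return dist[goal]

    print(edit_distance("kitten", "sitting"))  # 3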

A holographic algorithm can be loosely described as one that counts perfect matchings in an implicit planar graph.

So, my question is: are there any other families of algorithms that use well-known algorithms over implicit graphs to achieve a considerable speedup?

+1  A: 

Optimisation problems for which a greedy algorithm always gives an optimal solution have matroid structure. A matroid is a set system, so it's more general than a graph (which is a set system in which every subset, called an edge, has exactly two elements), but it might still be of interest to you.
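To make that concrete, here is a small sketch of mine (with a made-up toy graph): the generic matroid greedy algorithm only ever sees the matroid through an independence oracle, and plugging in the graphic matroid (independent sets = forests) recovers Kruskal's maximum-weight spanning forest.

    # Generic matroid greedy: repeatedly take the heaviest element that
    # keeps the chosen set independent, according to an independence oracle.
    def greedy_max_weight(elements, weight, is_independent):
        chosen = []
        for e in sorted(elements, key=weight, reverse=True):
            if is_independent(chosen + [e]):
                chosen.append(e)
        return chosen

    def is_forest(edges):
        # Independence oracle for the graphic matroid: a set of edges is
        # independent iff it contains no cycle (checked with union-find).
        parent = {}
        def find(x):
            parent.setdefault(x, x)
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v, _ in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                return False
            parent[ru] = rv
        return True

    # Hypothetical toy graph: edges given as (u, v, weight).
    edges = [("a", "b", 4), ("b", "c", 2), ("a", "c", 5), ("c", "d", 1)]
    print(greedy_max_weight(edges, weight=lambda e: e[2], is_independent=is_forest))
    # -> [('a', 'c', 5), ('a', 'b', 4), ('c', 'd', 1)], a maximum-weight spanning tree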

Holographic algorithms look very interesting and I haven't heard of them before -- will definitely take a look!

j_random_hacker
So a greedy algorithm finds a maximum-weight independent set of an implicit matroid (or, more generally, of a special greedoid or accessible set system)! I find all of this very beautiful.
Diego
That sounds about right... :) Also I appreciate your marking my answer "correct" but you might want to just +1 me instead -- marking an answer "correct" so early on might discourage other answerers.
j_random_hacker
You are right, I did that because I can't up-vote you yet (I need more reputation points).
Diego
+2  A: 

Another idea that comes to mind is polyhedral optimisation. Basically, linear programming is an efficient way to solve optimisation problems involving linear inequalities and continuous variables, but restricting the variables to the integers leads to an NP-hard optimisation problem in general. That's unfortunate because a huge variety of problems can be expressed as integer programming problems.

The usual approach has been to exhaustively enumerate all feasible solutions and select the best, possibly using branch and bound to prune parts of the search space that cannot contain an optimal solution. That can sometimes work quite well, but the newer polyhedral approach is often dramatically better.
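As a rough illustration (a hypothetical toy setup of mine, nowhere near what real solvers do), LP-based branch and bound for a 0/1 minimisation problem looks something like this, using scipy's linprog for the relaxation bounds:

    # Sketch: branch and bound for min c.x s.t. A_ub x <= b_ub, x in {0,1}^n.
    # The LP relaxation at each node gives a lower bound used for pruning.
    import numpy as np
    from scipy.optimize import linprog

    def branch_and_bound(c, A_ub, b_ub, n, tol=1e-6):
        best_val, best_x = np.inf, None

        def solve(fixed):
            nonlocal best_val, best_x
            bounds = [(fixed[i], fixed[i]) if i in fixed else (0, 1) for i in range(n)]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            if not res.success or res.fun >= best_val - tol:
                return  # infeasible node, or pruned by the LP bound
            x = res.x
            frac = [i for i in range(n) if abs(x[i] - round(x[i])) > tol]
            if not frac:
                best_val, best_x = res.fun, np.round(x)  # integral leaf
                return
            i = frac[0]  # branch on a fractional variable
            solve({**fixed, i: 0})
            solve({**fixed, i: 1})

        solve({})
        return best_val, best_x

    # Tiny example: maximise x0 + x1 (i.e. minimise -x0 - x1) with x0 + x1 <= 1.5.
    print(branch_and_bound([-1, -1], [[1, 1]], [1.5], n=2))  # optimal value -1.0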

The method that I understand best is the cutting-plane method, in which a linear relaxation of the problem is solved (using e.g. the simplex algorithm); if the solution happens to be integral, we have the answer for the integer restriction of the problem too. If it's not, then a plane is sought that separates the fractional solution just found from the set of integer-feasible solutions. This plane is then added to the problem as a constraint, and the new problem is solved by linear programming again. This continues until an integral solution is produced.
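Schematically, the loop looks something like this (a sketch of mine that assumes a problem-specific separation oracle separate(x), returning either a violated valid inequality (a, beta) with a.x <= beta, or None; designing good oracles is where the real work lies):

    # Sketch of a cutting-plane loop: solve the LP relaxation, and while the
    # optimum is fractional, ask the separation oracle for a cut and re-solve.
    import numpy as np
    from scipy.optimize import linprog

    def cutting_plane(c, A_ub, b_ub, bounds, separate, max_rounds=100, tol=1e-6):
        A = [np.asarray(row, dtype=float) for row in A_ub]
        b = [float(v) for v in b_ub]
        x = None
        for _ in range(max_rounds):
            res = linprog(c, A_ub=np.vstack(A), b_ub=np.array(b),
                          bounds=bounds, method="highs")
            if not res.success:
                return None  # relaxation infeasible or unbounded
            x = res.x
            if np.all(np.abs(x - np.round(x)) < tol):
                return np.round(x)  # integral: done
            cut = separate(x)  # problem-specific separation oracle
            if cut is None:
                return x  # no cut found; give up with a fractional point
            a, beta = cut  # valid inequality a.x <= beta violated by x
            A.append(np.asarray(a, dtype=float))
            b.append(float(beta))
        return x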

Clearly such a plane must exist; further, a general-purpose algorithm exists for finding one (although its performance is usually very weak). Researchers have devoted a lot of time to finding better algorithms for solving this "separation problem" for specific problem types. The most famous example is the Travelling Salesman Problem, where the increase in computationally feasible problem size is nothing less than staggering.

The cutting-plane approach achieves better performance by limiting the number of constraints simultaneously under consideration; an alternative approach is column generation, in which the number of variables being optimised at any one time is limited instead.

Not sure if this really fits your criterion of an implicit graph algorithm, but I think it's certainly a fascinating, non-obvious approach!

j_random_hacker