views: 355

answers: 5

Hi

I'm trying to work out an algorithm for finding a path across a directed graph. It's not a conventional path and I can't find any references to anything like this being done already.

I want to find the path which has the maximum minimum weight.

I.e. if there are two paths with weights 10->1->10 and 2->2->2, then the second path is considered better than the first, because its minimum weight (2) is greater than the first path's minimum weight (1).

If anyone can work out a way to do this, or just point me in the direction of some reference material it would be incredibly useful :)

EDIT: It seems I forgot to mention that I'm trying to get from a specific vertex to another specific vertex. Quite an important point there :/

EDIT2: As someone below pointed out, I should highlight that the edge weights are non-negative.

+1  A: 

Use either Prim's or Kruskal's algorithm. Just modify them so they stop when they find out that the vertices you ask about are connected.

EDIT: You ask for maximum minimum, but your example looks like you want minimum maximum. In the case of maximum minimum, Kruskal's algorithm won't work.

EDIT: The example is okay, my mistake. Only Prim's algorithm will work then.

Jasiu
Ah, starting with the edges ordered in *descending* order of weight? This is a great idea, and it's easy to prove it works. :)
ShreevatsaR
I definitely want maximum minimum. As my example shows: minimum of (10,1,10) = 1, minimum of (2,2,2) = 2, maximum = 2.
Martin
Yes, I must have read something wrong :). So Prim's algorithm is the one that stays in the game.
Jasiu
I know I didn't put it down as a requirement, but presumably Prim's will not always return the shortest maximin path (shortest as in fewest edges traversed)? Obviously I'm simulating some sort of pipe network, and if I don't use the shortest path then capacity gets wasted. I'm not too worried about that at the moment to be honest, but if I'm wrong and Prim's will return the shortest maximin path, that's excellent :)
Martin
Kruskal's will also work. (Start with edges sorted in descending order of weight, add one by one till the two vertices are in the same component.)
ShreevatsaR
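
As a rough sketch of that Kruskal-style variant (treating connectivity as undirected, as the component test implies; the edge-list layout and names here are purely illustrative):

def kruskal_max_min(edges, start, target):
    # edges: list of (u, v, weight) tuples; components tracked with a tiny union-find
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path halving keeps the trees shallow
            x = parent[x]
        return x
    # add edges from heaviest to lightest until start and target share a component
    for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
        parent[find(u)] = find(v)
        if find(start) == find(target):
            return w         # the weight at which they first connect is the maximin value
    return None              # start and target never end up connected
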
Martin, you can do it in two phases: first, compute the maximum minimum path; that gives you the bottleneck weight. Then you can run BFS or Dijkstra to find the shortest path, shortest meaning fewest edges (use BFS) or lowest total weight (use Dijkstra). Modify these algorithms so they don't take into account edges lighter than the value computed in the first phase.
Jasiu
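
A minimal sketch of that second phase, assuming the bottleneck value from phase one is already known (the fewest-edges variant; the graph layout and names are illustrative):

from collections import deque

def shortest_wide_path(graph, start, target, bottleneck):
    # graph: {node: [(neighbor, weight), ...]}
    # plain BFS that ignores edges lighter than the bottleneck found in phase one
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == target:                   # rebuild the path by walking the parents back
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nb, w in graph.get(node, []):
            if w >= bottleneck and nb not in parent:
                parent[nb] = node
                queue.append(nb)
    return None                              # can't happen if the bottleneck value is correct
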
I'm not convinced that either Prim's or Kruskal's algorithm will work here. Would someone mind updating the answer with a more detailed proof? Thanks!
j_random_hacker
+2  A: 

You could also use the "binary search on the answer" paradigm. That is, do a binary search on the weights, testing for each weight w whether you can find a path in the graph using only edges of weight at least w.

The largest w for which you can (found through binary search) gives the answer. Note that you only need to check whether a path exists, so just an O(|E|) breadth-first/depth-first search, not a shortest-path computation. So it's O(|E|*log(max W)) in all, comparable to the Dijkstra/Kruskal/Prim O(|E| log |V|) (and I can't immediately see a proof of those, either).
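
A minimal sketch of this approach, assuming an adjacency-list graph and binary-searching over the distinct edge weights (all names here are illustrative):

from collections import deque

def max_bottleneck(graph, start, target):
    # graph: {node: [(neighbor, weight), ...]}
    def reachable(limit):
        # plain BFS using only edges of weight >= limit
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            if node == target:
                return True
            for nb, w in graph.get(node, []):
                if w >= limit and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        return False
    weights = sorted({w for adj in graph.values() for _, w in adj})
    lo, hi, best = 0, len(weights) - 1, None
    while lo <= hi:                          # find the largest weight that still leaves a path
        mid = (lo + hi) // 2
        if reachable(weights[mid]):
            best, lo = weights[mid], mid + 1
        else:
            hi = mid - 1
    return best                              # None if the target is unreachable at all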

ShreevatsaR
That's an interesting technique, and it certainly sounds like it should work.
Martin
A: 

This can be solved using a BFS-style algorithm; however, you need two modifications:

  • Instead of marking each node as "visited", you mark it with the minimum weight along the path you took to reach it.

For example, if I and J are neighbors, I has value w1, and the weight of the edge between them is w2, then J=min(w1, w2).

  • If you reach a marked node with value w1, you might need to re-mark it and process it again, if you can assign it a new value w2 with w2 > w1. This is required to make sure you get the maximum of all minimums.

For example, if I and J are neighbors, I has value w1, J has value w2, and the weight of the edge between them is w3, then if min(w1, w3) > w2 you must re-mark J and process all its neighbors again.
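
A rough sketch of this relabel-and-requeue idea, assuming an adjacency-list graph (the names are just illustrative):

from collections import deque

def max_min_bfs(graph, start, target):
    # graph: {node: [(neighbor, weight), ...]}
    # best[v] holds the largest "minimum weight along a path" found so far for v
    best = {start: float('inf')}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nb, w in graph.get(node, []):
            cand = min(best[node], w)
            if cand > best.get(nb, -1):      # re-mark the neighbour and process it again
                best[nb] = cand
                queue.append(nb)
    return best.get(target)                  # None means the target was never reached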

DanJ
Why would this terminate in polynomial time? (It looks correct, but doesn't look polynomial-time.)
ShreevatsaR
I imagine that the worst case for this algorithm would be pretty nasty. I can't see any problems with the algorithm, except that the "process all its neighbours again" part is a bit fuzzy; wouldn't you have to process the entire path back to the source again?
Martin
A: 

Ok, answering my own question here just to try and get a bit of feedback on the tentative solution I worked out before posting here:

Each node stores a "path fragment": the entire path to itself so far.

0) set current vertex to the starting vertex
1) Generate all path fragments from this vertex and add them to a priority queue
2) Take the fragment off the top of the priority queue, and set the current vertex to the ending vertex of that path
3) If the current vertex is the target vertex, then return the path
4) goto 1

I'm not sure this will find the best path though; I think the exit condition in step 3 is a little ambitious. I can't think of a better exit condition though, since this algorithm doesn't close vertices (a vertex can be referenced in as many path fragments as it likes), so you can't just wait until all vertices are closed (like in Dijkstra's, for example).

Martin
Good work...this is just Prim's algorithm as suggested by Jasiu. ;)
Neil G
Wonderful, I do love it when I reinvent the wheel >_< Well I guess I have a fairly good understanding of how his suggestion works then ;)
Martin
A: 

You can still use Dijkstra's!

Instead of using +, use the min() operator.
In addition, you'll want to orient the heap/priority_queue so that the biggest things are on top.

Something like this should work (I've probably missed some implementation details):

import heapq

def widest_path(graph, start, target):
    # graph: {node: [(neighbor, weight), ...]}; returns the maximum minimum (bottleneck) weight
    best = {start: float('inf')}
    pq = [(-best[start], start)]             # max-heap via negated widths, biggest on top
    while pq:
        width, node = heapq.heappop(pq)
        width = -width
        if node == target:
            return width
        for neighbor, edge_w in graph.get(node, []):
            cand = min(width, edge_w)        # use min() where Dijkstra's would use +
            if cand > best.get(neighbor, -1):
                best[neighbor] = cand
                heapq.heappush(pq, (-cand, neighbor))
    return None                              # target unreachable

It is guaranteed that whenever you get to a node you followed an optimal path (since you find all possibilities in decreasing order, and you can never improve your path by adding an edge)

The time bounds are the same as Dijkstra's - O(|E| log |V|).

EDIT: oh wait, this is basically what you posted. LOL.

v3
No, Dijkstra's doesn't work here, for reasons similar to why it doesn't work on graphs with negative edge weight.
Christoph
(for a counterexample, see my posting)
Christoph