I've learnt a dynamic programming algorithm to find the "cheapest" path from A to B. Each sub-path has an associated cost.

Each corner is calculated using

    D(i,j).value = min( D(i-1,j).value + D(i,j).x,  D(i,j-1).value + D(i,j).y )

where x and y are the costs of the edges to the left of the node and below the node.
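
For concreteness, here is a rough sketch of that fill in Python. The array names left_cost/below_cost for the x and y edge costs, and the 0-based grid with A at (0,0) and B at (n-1,m-1), are just placeholders I picked for illustration:

    INF = float("inf")

    def cheapest_values(n, m, left_cost, below_cost):
        # left_cost[i][j]:  cost of the edge entering (i, j) from the left   (the "x" above)
        # below_cost[i][j]: cost of the edge entering (i, j) from below      (the "y" above)
        value = [[INF] * m for _ in range(n)]
        value[0][0] = 0                      # A sits at (0, 0)
        for i in range(n):
            for j in range(m):
                if i == 0 and j == 0:
                    continue
                from_left = value[i - 1][j] + left_cost[i][j] if i > 0 else INF
                from_below = value[i][j - 1] + below_cost[i][j] if j > 0 else INF
                value[i][j] = min(from_left, from_below)
        return value                         # value[n-1][m-1] is the cheapest cost to B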

I'm now having trouble figuring out how to count the number of possible cheapest paths in the best possible time.

Any suggestions?

http://www.freeimagehosting.net/uploads/f6e0884a2d.png

A: 

Check out the hill climbing and A* algorithms on Wikipedia.

JC
A: 

A* is a best-first search and will probably work pretty well for you.

Mr-sk
A* is best suited to map traversal where some nodes are blocked and most nodes have a common cost. Here is a good reference: http://www.policyalmanac.org/games/aStarTutorial.htm
Chris S
+7  A: 

You're looking for Dijkstra's algorithm. It is a graph search algorithm that solves the single-source shortest path problem for a graph with nonnegative edge path costs, producing a shortest path tree.
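
For reference, a minimal textbook sketch in Python; the adjacency-list format graph[u] = [(v, cost), ...] is an assumption made purely for illustration:

    import heapq

    def dijkstra(graph, source):
        # graph[u] = list of (v, cost) pairs with nonnegative costs (assumed format)
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue  # stale heap entry; this node was already settled with a better distance
            for v, cost in graph.get(u, []):
                nd = d + cost
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist  # cheapest cost from source to every reachable node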

Chris S
Do you think that's the simplest way of doing things?
Dave
It's a classic. There are a great many graph spanning algorithms, but you should know about Dijkstra's if you're looking at that kind of thing.
Joe
Yes, Dijkstra's is a brilliant algorithm that you really should learn if you are trying to solve something like this. It is extremely fast because each node is only visited at most once.
Justin Peel
I looked up Dijkstra's algorithm here: http://www.youtube.com/watch?v=8Ls1RqHCOPw. I can see how that would let me find a single quickest path, but I'm looking to find all of the alternative cheapest paths. How can I do this?
Dave
Dijkstra's algorithm may be a classic, but it isn't the solution to the OP's problem. It finds one shortest path. It cannot be modified in any way to get the number of shortest paths.
MAK
@MAK, you are correct that Dijkstra's doesn't solve the "current" question, but it did solve the question as it was originally worded...
Chris S
A: 

Try looking for "shortest path" algorithms, but there is a catch.

Many shortest-path searches use an estimated distance to the goal as a heuristic, which often has to satisfy certain conditions. A*, for example, uses a distance-to-goal estimation function and is guaranteed to find the shortest path as long as that function never overestimates the distance to the goal. Since you're looking for the path with the lowest arbitrary cost, you won't have that sort of heuristic easily available. Also, you know the length of the path, so algorithms like iterative deepening aren't going to be useful.

Any deterministic algorithm, like Dijkstra's algorithm, should work fine.

David Thornley
+1  A: 

The dynamic programming approach you describe is called DAG shortest path. It only works on directed acyclic graphs (that is, graphs with no cycles). Its asymptotic runtime is O(V+E) (where V and E are the number of vertices and edges, respectively), which is faster than Dijkstra's algorithm.

I'm not sure, but are you asking how to calculate the number of paths whose length is equal to that of the shortest path?

You can do that by maintaining predecessor information while you calculate the length of the shortest path. When you reach a node (x,y), record which of its incoming edges lie on a cheapest path to it. One way to implement this is to add two boolean fields, D(x,y).optimalUp and D(x,y).optimalRight, indicating whether you get an optimal solution by entering (x,y) going up or right, respectively. For example, set D(x,y).optimalUp to true if going up from (x,y-1) results in the cheapest path to (x,y).

Then you can do a second pass to count the number of cheapest paths, using dynamic programming. Add another field, say D(x,y).count (an integer), which holds the number of ways to go from A to (x,y) along a cheapest path. There are two ways to reach (x,y): via (x-1,y) and via (x,y-1). The count of (x,y-1) should only be added to the count of (x,y) if it is possible to achieve the cheapest path to (x,y) by going via (x,y-1). The same principle holds for (x-1,y).

We then get the recurrence:

D(x,y).count =
    (D(x,y).optimalUp    ? D(x,y-1).count : 0)
  + (D(x,y).optimalRight ? D(x-1,y).count : 0)

( ?: is the conditional operator in C/C++/Java.)
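
Here is a rough Python sketch of both passes, folded into one sweep since the values each node needs are already available when it is visited. The grid layout, the names right_cost/up_cost for the edge costs, and the 0-based indexing with A at (0,0) are my assumptions, not part of the answer:

    INF = float("inf")

    def count_cheapest_paths(n, m, right_cost, up_cost):
        # right_cost[x][y]: cost of the edge from (x-1, y) to (x, y)  -- entering by going right
        # up_cost[x][y]:    cost of the edge from (x, y-1) to (x, y)  -- entering by going up
        value = [[INF] * m for _ in range(n)]
        optimal_right = [[False] * m for _ in range(n)]
        optimal_up = [[False] * m for _ in range(n)]
        count = [[0] * m for _ in range(n)]
        value[0][0] = 0
        count[0][0] = 1                      # one (empty) cheapest path from A to itself
        for x in range(n):
            for y in range(m):
                if x == 0 and y == 0:
                    continue
                # Cheapest value and the "which incoming moves were optimal" flags.
                via_right = value[x - 1][y] + right_cost[x][y] if x > 0 else INF
                via_up = value[x][y - 1] + up_cost[x][y] if y > 0 else INF
                value[x][y] = min(via_right, via_up)
                optimal_right[x][y] = (via_right == value[x][y])
                optimal_up[x][y] = (via_up == value[x][y])
                # The count recurrence from above.
                count[x][y] = (count[x][y - 1] if optimal_up[x][y] else 0) \
                            + (count[x - 1][y] if optimal_right[x][y] else 0)
        return value[n - 1][m - 1], count[n - 1][m - 1]

If I have encoded it correctly, feeding this the example grid below, with only right/up moves allowed, yields a cheapest cost of 12, achieved by exactly two paths.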

Judging from your picture, it seems your graph is a grid. Beware that going only up or right need not result in the shortest path. In the graph below, the shortest path from A to B is 8, but you cannot achieve better than 12 if you go only right and up.

x -1-- x -1-- B
|      |      |
1      9      9
|      |      |
x -1-- x -1-- x
|      |      |
9      9      1
|      |      |
A -1-- x -1-- x

Since this is marked as homework, I won't be providing more details than this for now. This should nevertheless be a nudge in the right direction (if I have understood your problem correctly). Feel free to ask follow-up questions, though.

stubbscroll
Hi, you did understand the question correctly, although I neglected to mention that you can only go right and up. I didn't understand what your predecessor list should contain. Also, you mentioned using a dynamic programming algorithm to count the number of paths with length equal to the shortest path - that's exactly what I'm having difficulty figuring out how to do. Any clarification/help would be appreciated.
Dave
Added a more detailed explanation of how to calculate the number of paths.
stubbscroll
+1  A: 

Looks like you are working with a graph where the nodes are points on a 2D grid and each node has a directed edge to the node above it and another to the node to its right.

I don't think just applying Dijkstra's algorithm will work here. It finds one cheapest cost path, and there is really no way to modify it to find all shortest paths.

Since this is such a special graph (i.e. directed and acyclic), you can compute the number of cheapest paths, provided you know what the cost of the cheapest path is, using a simple recurrence. Notice that

number_of_paths(i,j) = number_of_paths(i-1,j) + number_of_paths(i,j-1)

i.e. the number of paths from any node is the sum of the numbers of paths from its two neighbouring nodes, the one above it and the one to its right. (This omits the base cases: the current node is the destination, or the destination is unreachable from the current node.)
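
As an aside, here is a minimal memoised rendering of this unconstrained count in Python; I follow the formula literally, so the destination is taken to be (0,0) and each step decreases one coordinate, which is my assumption about the coordinate convention:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def number_of_paths(i, j):
        # Counts monotone paths from (i, j) to an assumed destination at (0, 0).
        if i < 0 or j < 0:
            return 0   # stepped off the grid: destination unreachable
        if i == 0 and j == 0:
            return 1   # reached the destination
        return number_of_paths(i - 1, j) + number_of_paths(i, j - 1)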

The only thing needed is to modify this recurrence so that it counts only the cheapest paths. We already know that the cheapest path has some cost X, and we can use that to augment the recurrence. At the start node, the cost of the remaining path is X. At any node adjacent to the start node (i.e. the nodes directly above and to the right of it), the remaining cost is therefore X-e, where e is the cost of the edge connecting the two nodes. The same rule applies at any node whose remaining cost is known. Thus we know we have traversed a cheapest path exactly when we reach the destination node with a remaining cost of 0 (i.e. we have subtracted the costs of all the edges that form a cheapest path).

So we can just augment our recurrence to keep 3 values instead of 2: the coordinates and the remaining cost of the path (essentially converting this into the same problem on a 3D graph, where the third dimension is cost). The start node is of the form (x_start,y_start,cost) and the destination node is of the form (x_end,y_end,0). Our recurrence would look like:

paths(x,y,cost_left) =
    0    if x_end,y_end is unreachable from x,y, or cost_left < 0
    1    if x == x_end and y == y_end and cost_left == 0
    paths(x-1,y,cost_left-edge(x,y,x-1,y)) + paths(x,y-1,cost_left-edge(x,y,x,y-1))    otherwise

The complexity of the algorithm should be O(n*m*C) where n and m are the dimensions of the grid and C is the cost of the cheapest path.
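
A hedged Python sketch of this memoised recurrence; the function names, the edge-cost arrays left_edge/down_edge standing in for edge(x,y,x-1,y) and edge(x,y,x,y-1), and the convention that every move decreases one coordinate are all assumptions made to match the formula above:

    from functools import lru_cache

    def count_cheapest(x_start, y_start, cheapest_cost, x_end, y_end, left_edge, down_edge):
        # left_edge[x][y] ~ edge(x, y, x-1, y); down_edge[x][y] ~ edge(x, y, x, y-1)
        # (array names and the decreasing-coordinate convention are assumptions)

        @lru_cache(maxsize=None)
        def paths(x, y, cost_left):
            if x < x_end or y < y_end or cost_left < 0:
                return 0                      # destination unreachable or budget overspent
            if x == x_end and y == y_end:
                return 1 if cost_left == 0 else 0
            total = 0
            if x > x_end:
                total += paths(x - 1, y, cost_left - left_edge[x][y])
            if y > y_end:
                total += paths(x, y - 1, cost_left - down_edge[x][y])
            return total

        return paths(x_start, y_start, cheapest_cost)

With memoisation there are at most n*m*(C+1) distinct (x, y, cost_left) states, each doing constant work, which matches the O(n*m*C) bound above (assuming integer edge costs).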

MAK