
UPDATED

After more reading, the solution can be given with the following recurrence relation:

(a) When i = 1 and j = 2: l(i, j) = dist(p_i, p_j)
(b) When i < j - 1: l(i, j) = l(i, j - 1) + dist(p_(j-1), p_j)
(c) When i = j - 1 and j > 2: l(i, j) = min over 1 <= k < i of ( l(k, i) + dist(p_k, p_j) )

This is now starting to make sense, except for part (c). How would I go about determining the value of k that gives the minimum? I suppose it means you iterate through all possible k values and just keep the smallest result of l(k, i) + dist(p_k, p_j)?
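
In other words, something like this? (Just a sketch; it assumes 1-based indexing, a table l indexed as l[k][i], a list pts of (x, y) points sorted by x, and a Euclidean dist helper; the names are placeholders, not from any particular source.)

import math

def dist(a, b):
    # Euclidean distance between two (x, y) points
    return math.hypot(a[0] - b[0], a[1] - b[1])

def case_c(l, pts, i, j):
    # Part (c): applies when i == j - 1 and j > 2.
    # Try every k with 1 <= k < i as the point that precedes p_j on the
    # other arm of the tour, and keep the smallest total.
    best = float('inf')
    for k in range(1, i):
        best = min(best, l[k][i] + dist(pts[k], pts[j]))
    return best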


Yes, this is definitely a problem I am studying at school. We are studying bitonic tours for the traveling salesman problem.

Anyway, say I have 5 vertices {0, 1, 2, 3, 4}. I know my first step is to sort these in order of increasing x-coordinate. From there, I am a bit confused about how this would be done with dynamic programming.
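
Just to make that first step concrete, I picture something like this (the coordinates are made up):

pts = [(3, 1), (0, 2), (4, 0), (1, 5), (2, 3)]
pts.sort(key=lambda p: p[0])   # sort by x-coordinate
# pts is now [(0, 2), (1, 5), (2, 3), (3, 1), (4, 0)]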

I am reading that I should scan the list of sorted nodes and maintain optimal paths for both parts (the initial path and the return path). I am confused as to how I will calculate these optimal paths. For instance, how will I know whether a given node belongs in the initial path or the return path, since it cannot be in both (except for the endpoints)? Thinking back to Fibonacci in dynamic programming, you basically start with your base case and work your way forward. I guess what I am asking is: how would I get started with the bitonic traveling salesman problem?

For something like the Fibonacci numbers, a dynamic programming approach is quite clear. However, I don't know if I am just being dense, but I am quite confused trying to wrap my head around this problem.

Thanks for looking!

NOTE: I am not looking for complete solutions, but at least some good tips to get me started. For example, if this were the Fibonacci problem, one could illustrate how the first few numbers are calculated. Please let me know how I can improve the question as well.

A: 

Impress your professor and use a genetic algorithm. :)

hypoxide
A: 

I've used python-graph before. It supports the following features and algorithms:

* Support for directed, undirected, weighted and non-weighted graphs
* Support for hypergraphs
* Canonical operations
* XML import and export
* DOT-Language output (for usage with Graphviz)
* Random graph generation 

* Accessibility (transitive closure)
* Breadth-first search
* Critical path algorithm
* Cut-vertex and cut-edge identification
* Cycle detection
* Depth-first search
* Heuristic search (A* algorithm)
* Identification of connected components
* Minimum spanning tree (Prim's algorithm)
* Mutual-accessibility (strongly connected components)
* Shortest path search (Dijkstra's algorithm)
* Topological sorting
* Transitive edge identification

sample code:

import graph

gr = graph.graph()

gr.add_nodes([ "A", "B", "C" ])
gr.add_links( "A", "B" )
gr.add_links( "B", "C" )

#st = spanning tree
st, order = gr.breadth_first_search(root="A")
print st
dassouki
I suspect that showing you can find a library implementation isn't going to satisfy the professor. It wouldn't satisfy me, at least.
Charlie Martin
+1  A: 

Okay, the key notions in a dynamic programming solution are:

  • you pre-compute smaller problems
  • you have a rule that lets you combine solutions of smaller problems into solutions for bigger problems
  • you have a known property of the problems that lets you prove the solution is really optimal under some measure of optimality. (In this case, shortest.)

The essential property of a bitonic tour is that a vertical line in the coordinate system crosses a side of the closed polygon at most twice. So, what is a bitonic tour of exactly two points? Clearly, any two points form a (degenerate) bitonic tour. Three points have two bitonic tours ("clockwise" and "counterclockwise").

Now, how can you pre-compute the various smaller bitonic tours and combine them until you have all points included and still have a bitonic tour?


Okay, you're on the right track with your update. But now, in a dynamic programming solution, what you do is work bottom-up: pre-compute and memoize (not "memorize") the optimal subproblems.
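
To make that concrete, here is one rough way the bottom-up table fill could be organized, using the recurrence from your update. This is just a sketch under my own assumptions (0-based indexing, points already sorted by x, Euclidean distances), and the function and variable names are mine:

import math

def dist(a, b):
    # Euclidean distance between two (x, y) points
    return math.hypot(a[0] - b[0], a[1] - b[1])

def bitonic_tour_length(pts):
    # pts must already be sorted by increasing x; indices are 0-based here.
    n = len(pts)
    if n < 2:
        return 0.0
    # l[i][j] = length of the shortest bitonic path from p_i to p_j
    # that visits every point p_0 .. p_j (only entries with i < j are used)
    l = [[float('inf')] * n for _ in range(n)]
    l[0][1] = dist(pts[0], pts[1])                      # base case (a)
    for j in range(2, n):
        for i in range(j):
            if i < j - 1:
                # case (b): p_(j-1) and p_j lie on the same arm of the tour
                l[i][j] = l[i][j - 1] + dist(pts[j - 1], pts[j])
            else:
                # case (c): i == j - 1, pick the best predecessor k < i for p_j
                l[i][j] = min(l[k][i] + dist(pts[k], pts[j]) for k in range(i))
    # close the tour by joining the two endpoints of the path
    return l[n - 2][n - 1] + dist(pts[n - 2], pts[n - 1])

The table is filled strictly in order of increasing j, so every smaller subproblem is already memoized by the time it is needed, and the whole thing runs in O(n^2) time.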

Charlie Martin
A: 

This page looks to have quite a detailed analysis of the problem, as well as working Perl code to solve it.

j_random_hacker