## Description

In problem-set 6, we saw that neural networks with feedback (recurrent neural nets) can be used for error-tolerant data encoding, content-addressable memory, and recovery of complete memories based on incomplete or flawed cues. In this problem set, recurrent neural nets will be used as a means for solving combinatorial problems.

This assignment follows the presentation by Hopfield and Tank in "'Neural' Computation of Decisions in Optimization Problems," which is posted on Blackboard. Some example code is on the class website, along with data describing a 10-city distance matrix, "intercity_distances.dat". All 10 of the hypothetical cities lie on a plane within a square from (0.0, 0.0) to (1.0, 1.0). For these types of problems, people are good at visualizing effective solutions. However, the distance map need not be so geometrically intuitive: it could instead represent transition costs that do not scale intuitively with distance (e.g., as in airline pricing). In such non-geometric cases, people are not good at visualizing good solutions.

Since the 10-city tour is presumed to be cyclic (returning to the start city after visiting all cities), the choice of the start city is irrelevant. For example, a tour that travels among cities in the order A,B,C,D,E,F,G,H,I,J,A will have exactly the same path length as the tour B,C,D,E,F,G,H,I,J,A,B. For the 10-city tour, there are 10 such identical sequences. For a symmetric distance map (the cost of travel from A to B is identical to the cost of travel from B to A), there are another 10 tours in the reverse order that are identical in cost.
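
This rotation/reversal symmetry is easy to check numerically. A minimal sketch (the helper `tour_length` and the random stand-in cities are ours, not from the starter code):

```python
import numpy as np

def tour_length(D, tour):
    """Total length of a cyclic tour (the path returns to the first city)."""
    n = len(tour)
    return sum(D[tour[i], tour[(i + 1) % n]] for i in range(n))

# 10 random cities in the unit square, as stand-ins for the real data file
rng = np.random.default_rng(0)
pts = rng.random((10, 2))
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)  # symmetric

tour = list(range(10))
rotated = tour[1:] + tour[:1]   # start one city later: B,C,...,J,A
reverse = tour[::-1]            # the same tour traversed backwards

# All three lengths are identical: rotation reuses the same edge set,
# and reversal reuses it too because D is symmetric.
print(tour_length(D, tour), tour_length(D, rotated), tour_length(D, reverse))
```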

It is thus sufficiently general to mandate starting from city 1 and ending at city 1, which reduces the search size to 9! = 362,880 permutations. This is a relatively small solution set that can be evaluated exhaustively. However, adding just a few more cities makes exhaustive search intractable, and some other optimization technique is required.
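
As a baseline, the exhaustive evaluation over the 9! permutations can be sketched as follows (the function and variable names here are ours; the starter code may organize this differently):

```python
from itertools import permutations

import numpy as np

def exhaustive_tsp(D):
    """Evaluate every cyclic tour that starts and ends at city 0.

    Returns (all tour lengths, best length, best tour).  For a 10x10
    distance matrix this loops over 9! = 362,880 permutations.
    """
    n = D.shape[0]
    lengths, best_len, best_tour = [], float("inf"), None
    for perm in permutations(range(1, n)):   # city 0 is fixed as the start
        tour = (0,) + perm + (0,)            # cyclic: return to the start city
        length = sum(D[tour[i], tour[i + 1]] for i in range(n))
        lengths.append(length)
        if length < best_len:
            best_len, best_tour = length, tour
    return lengths, best_len, best_tour
```

On the assignment data this would be called as `exhaustive_tsp(np.loadtxt("intercity_distances.dat"))`, assuming the file loads as a plain 10x10 matrix.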

An exhaustive search of legal trips for "intercity_distances.dat" yields the following histogram of trips. Statistics of the legal solutions are:

- mean(all_solutions) = 6.1484
- std(all_solutions) = 0.6230
- min(all_solutions) = 3.7309
- max(all_solutions) = 8.0657

You should use a Hopfield net to attempt to find valid and low-cost solutions. This will require some programming, tuning, and experimentation.

Note that in Hopfield and Tank's paper, equation 12 contains a decay term: −U_xi/τ. You can try using this term, or try eliminating it. Also, you will need to pick values for the weights A, B, C, and D and choose how to initialize the U values. You will also need to pick a time step for integrating the values of U based on dU/dt. (I have suggested a value, but this might depend on your other parameter choices.) Further, note that Hopfield and Tank commented that they used a larger value for the current-source terms than that corresponding to their analytic derivation (page 147).
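
Putting the pieces together, one possible (untuned) implementation of the equation-12 dynamics might look like the sketch below. The defaults are in the spirit of Hopfield and Tank's 10-city run, but the initialization, time step, and step count here are assumptions you will need to tune for yourself:

```python
import numpy as np

def hopfield_tsp(D, A=500.0, B=500.0, C=200.0, Dd=500.0,
                 tau=1.0, u0=0.02, n_bias=15.0,
                 dt=1e-5, steps=20000, seed=0):
    """Integrate Hopfield & Tank's Eq.-12 dynamics for an N-city matrix D.

    V[x, i] near 1 means "city x occupies tour position i".  All default
    parameter values are assumptions to be tuned, not prescriptions.
    """
    N = D.shape[0]
    rng = np.random.default_rng(seed)
    U = 0.02 * (rng.random((N, N)) - 0.5)        # small random init (a choice)
    for _ in range(steps):
        V = 0.5 * (1.0 + np.tanh(U / u0))        # sigmoid activation
        row = V.sum(axis=1, keepdims=True) - V   # sum over positions j != i
        col = V.sum(axis=0, keepdims=True) - V   # sum over cities y != x
        glob = V.sum() - n_bias                  # global current-source term
        neigh = np.roll(V, -1, axis=1) + np.roll(V, 1, axis=1)  # positions i±1
        dU = (-U / tau                           # the optional decay term
              - A * row - B * col - C * glob
              - Dd * (D @ neigh))                # distance (data) term
        U = U + dt * dU                          # forward-Euler step
    return 0.5 * (1.0 + np.tanh(U / u0))
```

A candidate tour is then read off by taking, for each position i, the city x with the largest V[x, i], and checking that the result is a valid permutation.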

You might find that your set of outputs settles on values that do not satisfy the properties of a valid path. The starter code contains an optional variation to include integral-error feedback. Experiment with this option and comment on its utility.
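
The starter code's exact formulation is not reproduced here, but integral-error feedback typically accumulates each constraint violation over time and injects a corrective current into dU/dt. A hypothetical sketch, in which both the form and the gain `ki` are assumptions:

```python
import numpy as np

def integral_feedback_term(V, E_row, E_col, dt, ki=100.0):
    """Hypothetical integral-error feedback for the row/column constraints.

    E_row and E_col accumulate (in place) how far each city's row sum and
    each position's column sum deviate from 1; the returned array is an
    extra term to add to dU/dt.  Not taken from the starter code.
    """
    E_row += dt * (V.sum(axis=1, keepdims=True) - 1.0)  # per-city error
    E_col += dt * (V.sum(axis=0, keepdims=True) - 1.0)  # per-position error
    return -ki * (E_row + E_col)    # push U against persistent violations
```

Unlike the fixed A and B penalties, the integrated error keeps growing while a constraint stays violated, which can nudge a stalled network toward a legal tour.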

Report on the following results from your trials:

- Describe statistics for the performance of your neural-net computation of solutions, and compare to the exhaustive solution set.
- Report on your choices for all parameters and describe how/why you chose these parameters in terms of your network performance. Describe any variations you introduced.
- Comment on convergence to legal tours and the use of integral-error feedback.