
Hopfield nets -- example

The problem with using Hopfield nets and their derivatives lies in mapping particular problems onto the network architecture, since no learning is involved. One well-known application is Hopfield and Tank's solution of the widely quoted Travelling Salesman Problem, perhaps the best known of large-scale cost minimisation problems.

The approach, for an n-city problem, is to construct an n x n matrix of neurons. Each row corresponds to one city, and a 1 in column j indicates that that city should be the jth one visited on the tour. The problem is constrained by the need for the matrix to represent a permutation (that is, precisely one 1 per row and column), and by the desire to minimise the total distance travelled. The cost function

E = \frac{A}{2}\sum_{X}\sum_{i}\sum_{j \neq i} V_{Xi} V_{Xj}
  + \frac{B}{2}\sum_{i}\sum_{X}\sum_{Y \neq X} V_{Xi} V_{Yi}
  + \frac{C}{2}\Bigl(\sum_{X}\sum_{i} V_{Xi} - n\Bigr)^{2}
  + \frac{D}{2}\sum_{X}\sum_{Y \neq X}\sum_{i} d_{XY} V_{Xi}\bigl(V_{Y,i+1} + V_{Y,i-1}\bigr)

achieves this, where V_{Xi} is the output of the neuron in row X and column i, d_{XY} is the distance between cities X and Y, and column indices are taken modulo n. The first two terms, when minimised, permit at most one 1 per row and column, while the third encourages n 1's overall. The final term measures the total distance travelled.
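As an illustration of how the matrix representation and the cost function fit together, the following Python sketch evaluates this energy for a candidate solution matrix. (The function and variable names, and the constant values used at the end, are choices made for this example rather than part of the original formulation.)

import numpy as np

def tour_energy(V, dist, A, B, C, D):
    """Hopfield-Tank style energy of a candidate solution matrix.

    V    -- n x n matrix; V[x, i] close to 1 means city x is visited at step i.
    dist -- symmetric n x n matrix of inter-city distances.
    """
    n = V.shape[0]
    # Term 1: non-zero if any city appears at more than one step (row constraint).
    row = sum(V[x, i] * V[x, j]
              for x in range(n) for i in range(n) for j in range(n) if j != i)
    # Term 2: non-zero if any step hosts more than one city (column constraint).
    col = sum(V[x, i] * V[y, i]
              for i in range(n) for x in range(n) for y in range(n) if y != x)
    # Term 3: encourages exactly n active units overall.
    count = (V.sum() - n) ** 2
    # Term 4: sums distances between consecutively visited cities (indices modulo n).
    length = sum(dist[x, y] * V[x, i] * (V[y, (i + 1) % n] + V[y, (i - 1) % n])
                 for x in range(n) for y in range(n) if y != x
                 for i in range(n))
    return (A / 2) * row + (B / 2) * col + (C / 2) * count + (D / 2) * length

# A valid tour (here the identity permutation) incurs only the distance term.
n = 4
dist = np.array([[0, 1, 2, 1],
                 [1, 0, 1, 2],
                 [2, 1, 0, 1],
                 [1, 2, 1, 0]], dtype=float)
print(tour_energy(np.eye(n), dist, A=500, B=500, C=200, D=500))

A tour matrix that violates the permutation constraints scores strictly higher on the first three terms, which is exactly what drives the net towards valid tours.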

Recall that in the original formulation we derived the weights from known patterns and then defined the energy. Here we take this formula and call it the energy, deduce what the weights ought to be to represent such a net, and then let the net settle into stable patterns which ought to provide ``good'' tours. The formulae are

T_{Xi,Yj} = -A\,\delta_{XY}(1 - \delta_{ij}) - B\,\delta_{ij}(1 - \delta_{XY}) - C - D\,d_{XY}\bigl(\delta_{j,i+1} + \delta_{j,i-1}\bigr),
\qquad
I_{Xi} = Cn

where \delta_{ij} = 1 if and only if i = j, and 0 otherwise.
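As a sketch of how such a net might be assembled in practice, the following Python fragment builds the weight matrix from these formulae and lets a binary net settle by asynchronous threshold updates. (Hopfield and Tank's original network used continuous-valued units; the binary updates, the function names and the fixed update schedule here are simplifications chosen purely for illustration.)

import numpy as np

def tsp_weights(dist, A, B, C, D):
    """Weights T and biases I for the TSP net; units are indexed by (city, step)."""
    n = dist.shape[0]
    d = np.eye(n)                                  # Kronecker delta as a matrix
    T = np.zeros((n, n, n, n))                     # T[x, i, y, j]
    for x in range(n):
        for i in range(n):
            for y in range(n):
                for j in range(n):
                    T[x, i, y, j] = (-A * d[x, y] * (1 - d[i, j])
                                     - B * d[i, j] * (1 - d[x, y])
                                     - C
                                     - D * dist[x, y] * (d[j, (i + 1) % n] + d[j, (i - 1) % n]))
    I = C * n * np.ones((n, n))                    # constant external input
    return T.reshape(n * n, n * n), I.reshape(n * n)

def settle(T, I, steps=5000, seed=0):
    """Asynchronous binary updates from a random start; returns the final state."""
    rng = np.random.default_rng(seed)
    V = rng.integers(0, 2, size=I.shape).astype(float)
    for _ in range(steps):
        k = rng.integers(len(V))                    # pick one unit at random
        V[k] = 1.0 if T[k] @ V + I[k] > 0 else 0.0  # threshold its net input
    return V.reshape(int(len(V) ** 0.5), -1)        # back to the city-by-step matrix

# Illustrative use with the 4-city distance matrix from the previous sketch:
# T, I = tsp_weights(dist, A=500, B=500, C=200, D=500)
# V = settle(T, I)

In practice the final state must still be checked for validity, since the net can settle into states that violate the permutation constraints.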

This has been seen to work in practice, but the solutions are very sensitive to the choice of the constants A, B, C and D, since these represent conflicting aims.



Bob Fisher
Mon Aug 4 14:24:13 BST 1997