The learning rule defines the weights of a Hopfield network. The incremental and local properties of a learning rule are very important in attractor neural networks. Locality gives the architecture a natural parallelism, because each neuron can be viewed as a separate processor in its own right: a local learning rule updates the weight of a connection using only the information available to the neurons on either side of that connection.
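Locality can be made concrete with a minimal sketch. The function below is illustrative (the name, signature, and learning-rate value are assumptions, not taken from the text): the update for a single weight depends only on the states of the two neurons it connects, so every connection can be updated in parallel.

```python
def local_update(w_ij, pre, post, lr=1.0):
    """Update one connection weight using only locally available state.

    `pre` and `post` are the bipolar (+/-1) states of the two neurons
    on either side of the connection; no global information is needed.
    """
    return w_ij + lr * pre * post
```

Because the rule touches no state beyond the two endpoint neurons, each neuron (or each connection) can run its updates independently of the rest of the network.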

An incremental system provides straightforward adaptability: new attractors can be introduced without needing the information from which the old ones were generated. The Hebbian learning rule is both local and incremental, and has been studied extensively. Many researchers have focussed their attention on assessing the capacity of the Hebbian rule. Capacity is important for an attractor network, because a higher capacity makes more efficient use of the processors. Two main types of capacity are commonly considered, often called the absolute and relative capacities; here only the absolute capacity is discussed. The absolute capacity of the Hebbian rule is n/(2 ln n), where n is the total number of neurons.
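The Hebbian rule's local and incremental character can be sketched as follows. This is a standard formulation (a scaled outer-product update with zeroed diagonal), not code from the text; the helper name and normalisation are assumptions.

```python
import math
import numpy as np

def hebbian_store(W, pattern):
    """Incrementally add one bipolar (+/-1) pattern to the weight matrix.

    Local: the change to W[i, j] uses only pattern[i] and pattern[j].
    Incremental: previously stored patterns are not needed.
    """
    x = np.asarray(pattern, dtype=float)
    n = len(x)
    W = W + np.outer(x, x) / n
    np.fill_diagonal(W, 0.0)  # Hopfield networks have no self-connections
    return W

n = 100
W = np.zeros((n, n))
rng = np.random.default_rng(0)
for _ in range(3):
    W = hebbian_store(W, rng.choice([-1.0, 1.0], size=n))

# Absolute capacity of the Hebbian rule for n = 100 neurons: n/(2 ln n)
print(round(n / (2 * math.log(n)), 2))  # prints 10.86
```

For n = 100 the formula gives roughly 10.9 patterns, illustrating how modest the Hebbian capacity is relative to the number of neurons.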

I have been involved in developing a new learning rule which keeps both the local and incremental properties of the Hebbian rule, but increases the absolute capacity to n/√(2 ln n).