Associative Memory: Hopfield model
The Hopfield model is a distributed model of an associative memory. Each neuron corresponds to a pixel and can take the value -1 (off) or +1 (on). The network has stored a certain number of pixel patterns. During a retrieval phase, the network is started from some initial configuration and the network dynamics evolve towards the stored pattern that is closest to the initial configuration.
In the Hopfield model each neuron is connected to every other neuron (full connectivity). The connection matrix is

w_{ik} = \frac{1}{N} \sum_{m=1}^{p} x_i^{m} x_k^{m} ,

where N is the number of neurons, x_k^{m} is the value of neuron k in pattern number m, and the sum runs over all patterns from m = 1 to m = p. This is a simple correlation-based learning
rule (Hebbian learning). Since it is not an iterative rule it is sometimes
called one-shot learning. The learning rule works best if the patterns
that are to be stored are random patterns with equal probability for on
(+1) and off (-1). In a large network (N → ∞) the number of random
patterns that can be stored is approximately 0.14 N.
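A minimal sketch of this one-shot learning rule, assuming the stored patterns are the rows of a ±1 NumPy array; the function name and the zeroed diagonal (no self-connections, a common convention for the Hopfield model) are choices made here, not stated in the text above:

```python
import numpy as np

def hebbian_weights(patterns):
    """One-shot Hebbian rule: w_ik = (1/N) * sum_m x_i^m x_k^m.

    patterns: array of shape (p, N) with entries +1 or -1,
              one stored pattern per row.
    """
    p, N = patterns.shape
    W = patterns.T @ patterns / N      # sum of outer products over all patterns, divided by N
    np.fill_diagonal(W, 0.0)           # w_ii = 0: remove self-connections (common convention)
    return W
```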
Use the mouse to enter a pattern by clicking squares inside the rectangle "on" or "off". Then have the network store your pattern by pressing "Memorize". After storing some patterns (typically two), try entering a new pattern that you will use as a test pattern. Do not impose this new pattern, but use it as an initial state of the network. Press "Test" repeatedly to watch the network settle into one of the previously stored patterns.
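The memorize/test cycle described above can be sketched in code as follows. The asynchronous sign-update is the standard Hopfield retrieval dynamics; the network size (N = 100), the two random patterns, and the 10 flipped pixels are illustrative choices, not values from the text:

```python
import numpy as np

def recall(W, state, max_sweeps=10, rng=None):
    """Retrieval dynamics: repeatedly set s_i <- sign(sum_k w_ik s_k).

    Neurons are updated one at a time in random order (asynchronous updates),
    roughly corresponding to pressing "Test" until the network stops changing.
    """
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new_value = 1 if W[i] @ s >= 0 else -1   # sign of the local field at neuron i
            if new_value != s[i]:
                s[i], changed = new_value, True
        if not changed:
            break                                    # fixed point reached
    return s

# "Memorize": store two random +/-1 patterns with the one-shot Hebbian rule.
rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(2, N))
W = patterns.T @ patterns / N                        # Hebbian weights, as in the sketch above
np.fill_diagonal(W, 0.0)

# "Test": start from a corrupted copy of the first pattern (10 pixels flipped).
test = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
test[flip] *= -1

retrieved = recall(W, test, rng=rng)
print("overlap with stored pattern:", retrieved @ patterns[0] / N)   # close to 1.0 on successful recall
```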