
Back propagation in time

An alternative is to feed the output of the network back in as the input of the next `iteration'. This procedure, called `back propagation in time' or `recurrent back propagation', was suggested by Rumelhart et al.
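As a minimal sketch of this feedback loop (the single tanh layer, its size, and the number of iterations are illustrative assumptions, not taken from the text), the output at one step simply becomes the input at the next:

import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 3))   # hypothetical 3-unit network

x = rng.normal(size=3)                   # initial input
for _ in range(4):                       # four `iterations' through the same net
    x = np.tanh(W @ x)                   # the output is fed back in as the next input
print(x)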

Such a network can be `unfolded' into a feed-forward network consisting of a suitable number of replications of itself; this expanded net can then be trained with the ordinary back-propagation algorithm, and the weights of the actual links can afterwards be obtained by averaging their copies in the expanded form.
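The following sketch illustrates that procedure under assumed details that the text does not fix: a tiny layer of tanh units, four replications, a squared-error target at the final step, and a fixed learning rate. Each replica keeps its own copy of the weight matrix, ordinary back-propagation updates every copy, and the recurrent weights are recovered at the end by averaging the copies.

import numpy as np

rng = np.random.default_rng(1)
T = 4                                    # number of replications (time steps), assumed
n = 3                                    # units per copy, assumed

# One weight matrix per replica of the network in the expanded (unfolded) net.
W_copies = [rng.normal(scale=0.1, size=(n, n)) for _ in range(T)]

x0 = rng.normal(size=n)                  # initial input
target = np.zeros(n)                     # hypothetical target for the final output
lr = 0.1

for epoch in range(200):
    # Forward pass through the expanded feed-forward net.
    activations = [x0]
    for W in W_copies:
        activations.append(np.tanh(W @ activations[-1]))

    # Ordinary back-propagation through the replicas (squared error at the end).
    delta = (activations[-1] - target) * (1.0 - activations[-1] ** 2)
    for t in reversed(range(T)):
        grad = np.outer(delta, activations[t])             # gradient for this copy
        delta = (W_copies[t].T @ delta) * (1.0 - activations[t] ** 2)
        W_copies[t] -= lr * grad                            # each copy updated independently

# The weights of the actual (recurrent) links: average the copies in the expanded form.
W_recurrent = sum(W_copies) / T

In practice the copies can also be kept tied by summing their gradients into a single update, but the averaging step above mirrors the description given in the text.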




