An alternative is to feed the output of the network back as the input of another `iteration'. This procedure -- called `back propagation through time' or `recurrent back propagation' -- was suggested by Rumelhart et al.
Such a network can be represented as a suitable number of replications of itself, one per time step; this `expanded' net is then trained with the normal algorithm, and the weight of each actual link is obtained by averaging its copies in the expanded form.
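The expansion is easy to demonstrate concretely. The following is a minimal sketch, not taken from the original text: a hypothetical single-layer recurrent net with a tanh nonlinearity is unrolled for T steps, ordinary backpropagation yields a gradient for each replica's copy of the weight matrix, and the actual (shared) matrix is updated with the average of those per-replica gradients -- one common way of realising the averaging described above, in which the copies are kept identical throughout training rather than averaged once at the end.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
T, n = 4, 3                              # unrolling depth, state size
W = rng.normal(scale=0.5, size=(n, n))   # recurrent weights (the actual links)
x0 = rng.normal(size=n)                  # initial state
target = np.ones(n)                      # arbitrary target for the final state

for epoch in range(200):
    # Forward pass through the T replicas of the network.
    states = [x0]
    for t in range(T):
        states.append(np.tanh(W @ states[-1]))

    # Backward pass: ordinary backpropagation through the expanded net.
    # grads[k] is the gradient for replica t's private copy of W.
    grads = []
    delta = states[-1] - target          # d(squared error)/d(final state)
    for t in reversed(range(T)):
        pre = W @ states[t]              # pre-activation of replica t
        delta = delta * (1.0 - np.tanh(pre) ** 2)
        grads.append(np.outer(delta, states[t]))
        delta = W.T @ delta              # propagate to the previous replica

    # Averaging over the replicas gives the update for the actual links.
    W -= 0.1 * np.mean(grads, axis=0)

print("final loss:", 0.5 * np.sum((states[-1] - target) ** 2))
\end{verbatim}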