Example

Servan-Schreiber et al. present an example in which a network learns strings generated by a small finite-state grammar.


Each state of the grammar offers a two-way choice, and a training set of variable-length strings is generated by taking each choice with probability 0.5. The network has 6 input units (5 letters plus a `begin' symbol), 6 output units (5 letters plus an `end' symbol), and 3 hidden units whose previous activations are fed back to augment the input.
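The string-generation procedure can be sketched as a random walk over a transition table. The grammar below is a hypothetical example in the style of the Reber grammar often associated with this experiment; the exact states and transitions are an assumption, not taken from the text.

```python
import random

# Hypothetical finite-state grammar (assumed, not from the text):
# each state maps to its two (letter, next_state) choices; state 5 is terminal.
GRAMMAR = {
    0: [("T", 1), ("P", 2)],
    1: [("S", 1), ("X", 3)],
    2: [("T", 2), ("V", 4)],
    3: [("X", 2), ("S", 5)],
    4: [("P", 3), ("V", 5)],
}

def generate_string(rng=random):
    """Walk the grammar, taking each two-way choice with probability 0.5."""
    state, letters = 0, ["B"]          # `B' marks the begin state
    while state != 5:
        letter, state = rng.choice(GRAMMAR[state])
        letters.append(letter)
    letters.append("E")                # `E' marks the end state
    return "".join(letters)
```

Because each state loops back with probability 0.5, the generated strings vary in length, matching the variable-length training set described above.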

The trained network determines the grammatical validity of unseen strings with 100% accuracy, and the authors show that its internal hidden-unit states come to represent the states of the grammar accurately.
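The forward pass of such a simple recurrent net can be sketched as follows: at each step the previous hidden activations are copied back as context and concatenated with the current input. The dimensions (6 inputs, 3 hidden, 6 outputs) follow the text; the weight values, activation function, and one-hot encoding are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sizes from the text: 6 inputs (5 letters + begin), 3 hidden, 6 outputs
# (5 letters + end).  Weights are random placeholders, not trained values.
n_in, n_hid, n_out = 6, 3, 6
W_ih = rng.normal(scale=0.1, size=(n_in + n_hid, n_hid))  # input+context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_hid, n_out))          # hidden -> output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run_sequence(one_hot_inputs):
    """Feed a letter sequence through the net, copying the previous
    hidden activations back as context at every step."""
    context = np.zeros(n_hid)
    outputs = []
    for x in one_hot_inputs:
        hidden = sigmoid(np.concatenate([x, context]) @ W_ih)
        outputs.append(sigmoid(hidden @ W_ho))  # predicted next-letter scores
        context = hidden                        # the simple recurrence
    return np.array(outputs)
```

A string is judged grammatical by checking, at each step, that the actually observed next letter is among those the output layer predicts with high activation.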



Bob Fisher
Mon Aug 4 14:24:13 BST 1997