Speaker: Tony Robinson
Date: Jan 31, 2014
Time: 11:00AM - 12:00PM
Location: IF-4.31/4.33
Title: Will recurrent neural network language models scale?
Abstract: In "Up from trigrams! The struggle for improved language models" (1991), Fred Jelinek described the first use of trigrams in 1976 and then lamented: "The surprising fact is that now, a full 15 years later, after all the solid progress in speech recognition, the trigram model remains fundamental". Almost two decades later the situation was largely unchanged, but in 2010 Tomas Mikolov presented the "Recurrent neural network based language model" (RNN LM). After many decades we now have a new means for language modelling which is clearly much better than the n-gram. Having actively pioneered the use of RNNs in the 80s and 90s, the speaker's concern is whether RNNs will continue to outperform or whether there will be another "neural net winter". This talk addresses the question of whether RNN LMs will scale, first by examining the scaling properties of n-grams and then doing the same for RNN LMs. Scaling is considered in terms of the number of LM training words, number of parameters, processing power and memory. Preliminary results will be presented showing the largest reductions in perplexity reported so far, an analysis of performance on frequent and rare words, results on the newly released 1-billion-word language modelling benchmark, and the impact on word error rates in a commercial LVCSR system. The talk concludes with an assessment of whether RNN LMs will scale relative to the previously incumbent n-grams.
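As background to the abstract, the sketch below illustrates the Elman-style recurrence that Mikolov's RNN LM is built on, together with the perplexity measure used to compare it against n-grams. It is a minimal illustration only; the vocabulary size, layer sizes and all variable names are assumptions for the example, not details from the talk.

```python
import numpy as np

# Illustrative sketch of the simple-recurrent (Elman) network behind
# Mikolov's RNN LM (2010). Sizes and names are hypothetical.

rng = np.random.default_rng(0)
V, H = 10, 8            # toy vocabulary and hidden-layer sizes

W_ih = rng.normal(scale=0.1, size=(H, V))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(H, H))   # hidden -> hidden (recurrence)
W_ho = rng.normal(scale=0.1, size=(V, H))   # hidden -> output

def softmax(z):
    z = z - z.max()                          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def step(word_id, h):
    """One time step: consume a word, update the hidden state, and
    return P(next word | history) plus the new hidden state."""
    x = np.zeros(V)
    x[word_id] = 1.0                         # 1-of-N input encoding
    h = 1.0 / (1.0 + np.exp(-(W_ih @ x + W_hh @ h)))  # sigmoid hidden layer
    return softmax(W_ho @ h), h

# Perplexity over a toy word-id sequence:
#   PPL = exp(-(1/N) * sum_i log P(w_i | w_1 .. w_{i-1}))
seq = [3, 1, 4, 1, 5, 9, 2, 6]
h = np.zeros(H)
log_prob = 0.0
p, h = step(seq[0], h)
for w in seq[1:]:
    log_prob += np.log(p[w])
    p, h = step(w, h)
perplexity = np.exp(-log_prob / (len(seq) - 1))
print(f"toy perplexity: {perplexity:.1f}")   # ~V for random weights
```

With random weights the perplexity sits near the vocabulary size; training (for example by backpropagation through time) is what drives it below the n-gram baseline, which is the reduction the abstract reports on.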

Bio: Dr Tony Robinson is the founder of Cantab Research. He previously founded and managed the Advanced Speech Group at SpinVox/Nuance, SoftSound Ltd (now part of Autonomy/HP), and the Connectionist Speech Recognition research group at Cambridge University.
