|Date||May 05, 2014|
|Title||Learning Structured, Robust, and Multimodal Models (Technical Talk)|
|Abstract||Many powerful Monte Carlo techniques for estimating partition functions, such as annealed importance sampling (AIS), are based on sampling from a sequence of intermediate distributions which interpolate between a tractable initial distribution and the intractable target distribution. The near-universal practice is to use geometric averages of the initial and target distributions, but alternative paths can perform substantially better. We present a novel sequence of intermediate distributions for exponential families defined by averaging the moments of the initial and target distributions. We analyze the asymptotic performance of both the geometric and moment averages paths and derive an asymptotically optimal piecewise linear schedule. AIS with moment averaging performs well empirically at estimating partition functions of restricted Boltzmann machines (RBMs), which form the building blocks of many deep learning models.|
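As a rough illustration of the geometric-averaging baseline the abstract describes, the following is a minimal AIS sketch on a toy one-dimensional Gaussian problem. The densities, linear beta schedule, and sample counts are illustrative assumptions, not from the talk; both Gaussians have known normalizers, so the estimate can be checked against the true value.

```python
import math
import random

# Unnormalized log-densities: initial p0 = N(0, 1), target p1 = N(MU, SIGMA^2).
# (Toy choices for illustration; each Gaussian's true log Z is known in closed form.)
MU, SIGMA = 1.0, 0.8

def log_f0(x):
    return -0.5 * x * x

def log_f1(x):
    return -0.5 * ((x - MU) / SIGMA) ** 2

def log_f_beta(x, beta):
    # Geometric path: f_beta = f0^(1 - beta) * f1^beta.
    return (1.0 - beta) * log_f0(x) + beta * log_f1(x)

def ais_chain(betas, rng, n_mh=3, step=0.7):
    """One AIS run: returns the log importance weight."""
    x = rng.gauss(0.0, 1.0)            # exact sample from p0
    logw = 0.0
    for b_prev, b in zip(betas, betas[1:]):
        logw += log_f_beta(x, b) - log_f_beta(x, b_prev)
        for _ in range(n_mh):          # Metropolis moves leaving f_b invariant
            prop = x + rng.gauss(0.0, step)
            if math.log(rng.random()) < log_f_beta(prop, b) - log_f_beta(x, b):
                x = prop
    return logw

def estimate_log_z1(n_chains=500, n_steps=100, seed=0):
    rng = random.Random(seed)
    betas = [k / n_steps for k in range(n_steps + 1)]
    logws = [ais_chain(betas, rng) for _ in range(n_chains)]
    # log of the mean importance weight, computed stably via log-sum-exp
    m = max(logws)
    log_mean_w = m + math.log(sum(math.exp(w - m) for w in logws) / len(logws))
    return 0.5 * math.log(2 * math.pi) + log_mean_w   # log Z0 + log(Z1/Z0)
```

The moment-averaging path proposed in the talk would replace `log_f_beta` with intermediate distributions whose moments interpolate between those of the initial and target distributions, rather than interpolating the log-densities themselves.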
|Bio||Ruslan Salakhutdinov received his PhD in machine learning (computer science) from the University of Toronto in 2009. After spending two post-doctoral years at the Massachusetts Institute of Technology Artificial Intelligence Lab, he joined the University of Toronto as an Assistant Professor in the Department of Computer Science and the Department of Statistics. Dr. Salakhutdinov's primary interests lie in statistical machine learning, Bayesian statistics, deep learning, and large-scale optimization. He is the recipient of the Early Researcher Award, the Connaught New Researcher Award, the Alfred P. Sloan Research Fellowship, the Microsoft Faculty Fellowship, and a Google Faculty Award, and is a Fellow of the Canadian Institute for Advanced Research.|
NOTE: This is a joint ANC/ILCC Seminar. A less technical talk will be given on Tuesday at 11 a.m. in IF-4.31/4.33.