Abstract: Stein's method is a remarkable theoretical tool in probability theory for establishing approximation and limit theorems and error bounds. Although it has mostly been known to theoreticians, recent advances have shown that some of its key ideas can also be extremely useful for addressing the practical computational challenges in highly complex probabilistic models and Bayesian inference. In this talk, we will discuss a framework that achieves this by combining Stein's operator with reproducing kernel Hilbert spaces (RKHS). At the heart of this framework is a kernelized Stein discrepancy measure that allows us to assess the compatibility between data and distributions based on Fisher's score function, without knowing the normalization constants that are often the computational bottleneck for complex models. Kernelized Stein discrepancy also corresponds to a type of functional gradient of the KL divergence and draws intriguing connections to measure transport and variational inference. Our framework allows us to derive a number of practical algorithms for a variety of challenging statistical tasks, including goodness-of-fit tests for evaluating models with intractable normalization constants, scalable Bayesian inference combining the advantages of variational inference, MCMC, and gradient-based optimization, and approximate maximum likelihood algorithms for training deep generative models.
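To make the central idea concrete, the following is a minimal illustrative sketch (not the speaker's implementation) of a V-statistic estimate of the squared kernelized Stein discrepancy with an RBF kernel. It only requires the score function grad_x log p(x), so the normalization constant of p never appears; the function name `ksd_vstat` and the fixed bandwidth `h` are assumptions for this example.

```python
import numpy as np

def ksd_vstat(X, score, h=1.0):
    """V-statistic estimate of the squared kernelized Stein discrepancy
    between samples X (n x d) and a density p known only through its
    score function score(x) = grad_x log p(x), using an RBF kernel
    k(x, x') = exp(-||x - x'||^2 / (2 h^2)).  The bandwidth h is a
    hyperparameter (often set by the median heuristic in practice)."""
    n, d = X.shape
    S = score(X)                             # (n, d) scores at each sample
    diff = X[:, None, :] - X[None, :, :]     # (n, n, d) pairwise x_i - x_j
    sqdist = np.sum(diff ** 2, axis=-1)      # (n, n) squared distances
    K = np.exp(-sqdist / (2 * h ** 2))       # RBF kernel matrix

    # Stein kernel u_p(x, x') assembled term by term:
    # s(x)^T s(x') k(x, x')
    term1 = (S @ S.T) * K
    # s(x)^T grad_{x'} k, with grad_{x'} k = (x - x') / h^2 * k
    term2 = np.einsum('id,ijd->ij', S, diff) / h ** 2 * K
    # s(x')^T grad_x k, with grad_x k = -(x - x') / h^2 * k
    term3 = -np.einsum('jd,ijd->ij', S, diff) / h ** 2 * K
    # trace(grad_x grad_{x'} k) = (d / h^2 - ||x - x'||^2 / h^4) * k
    term4 = (d / h ** 2 - sqdist / h ** 4) * K

    return (term1 + term2 + term3 + term4).mean()
```

For example, with the standard normal score s(x) = -x, samples drawn from N(0, I) yield a small discrepancy, while samples from a shifted Gaussian yield a noticeably larger one, which is exactly the comparison a goodness-of-fit test builds on.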
10 February 2017
12:00 pm to 1:00 pm
"On Stein's method for practical statistical computation"
Qiang Liu, Dartmouth College