Yun Yang, University of Illinois Urbana-Champaign
Talk Title: “Mean-field Variational Inference via Wasserstein Gradient Flow”
Abstract: Variational inference provides an appealing alternative to traditional sampling-based approaches for implementing Bayesian inference due to its conceptual simplicity, statistical accuracy, and computational scalability. However, common variational approximation schemes, such as the mean-field (MF) approximation, still require a certain conjugacy structure to facilitate efficient computation, which may add unnecessary restrictions to the viable prior distribution family and impose further constraints on the variational approximation family. In this work, we develop a general computational framework for implementing MF variational inference via Wasserstein gradient flow (WGF), a modern mathematical technique for realizing a gradient flow over the space of probability measures. When specialized to a common class of Bayesian latent variable models, the proposed algorithm becomes an alternating minimization scheme based on a time-discretized WGF, which resembles a distributional version of the classical Expectation–Maximization algorithm: an E-step updates the variational distribution of the latent variables, and an M-step conducts steepest descent over the variational distribution of the model parameters. For this class of models, we analyze the convergence of the proposed algorithm using tools from optimal transport theory and subdifferential calculus in the space of probability measures.
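To give a flavor of the time-discretized WGF mentioned in the abstract, the following is a minimal toy sketch (not the speaker's algorithm): the Wasserstein gradient flow of the KL divergence toward a target density coincides with Langevin dynamics, so a forward-Euler particle discretization drives a cloud of particles toward a one-dimensional Gaussian target. All names, step sizes, and the Gaussian target are illustrative assumptions.

```python
import numpy as np

# Illustrative assumption: target pi = N(mu, sigma^2); the WGF of KL(q || pi)
# corresponds to Langevin dynamics, discretized below with step size h.
rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.0  # parameters of the toy Gaussian target

def grad_log_pi(x):
    # Score function of the Gaussian target: d/dx log pi(x)
    return -(x - mu) / sigma**2

# Initialize particles far from the target; each particle is a sample from q_t.
particles = rng.normal(-3.0, 0.5, size=5000)
h = 0.05  # time-discretization step of the gradient flow
for _ in range(2000):
    noise = rng.normal(size=particles.shape)
    # Forward-Euler update: drift along the score plus diffusion term
    particles = particles + h * grad_log_pi(particles) + np.sqrt(2 * h) * noise

# After many steps the particle cloud approximates pi = N(2, 1).
print(particles.mean(), particles.std())
```

In the talk's setting, an update of this kind would play the role of the M-step over the variational distribution of model parameters, alternated with an E-step updating the latent-variable distribution.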