$\alpha$-Variational Inference with Statistical Guarantees

Authors: Yun Yang, Debdeep Pati, Anirban Bhattacharya

We propose a family of variational approximations to Bayesian posterior distributions, called $\alpha$-VB, with provable statistical guarantees. The standard variational approximation is a special case of $\alpha$-VB with $\alpha=1$. When $\alpha \in (0,1]$, a novel class of variational inequalities is developed for linking the Bayes risk under the variational approximation to the objective function in the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the $\alpha$-VB procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate our general theory with a number of examples, including the mean-field variational approximation to (low)-high-dimensional Bayesian linear regression with spike and slab priors, mixture of Gaussian models, latent Dirichlet allocation, and (mixture of) Gaussian variational approximation in regular parametric models.
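
For concreteness, the $\alpha$-VB objective can be sketched as follows (the notation $L_n$, $\pi$, and $\Gamma$ is ours, not quoted from the paper): writing $L_n(\theta)$ for the likelihood, $\pi$ for the prior, and $\Gamma$ for the variational family, $\alpha$-VB maximizes the $\alpha$-tempered evidence lower bound

$$\widehat{q}_\alpha = \arg\max_{q \in \Gamma} \Big\{ \mathbb{E}_q\big[\alpha \log L_n(\theta)\big] - D_{\mathrm{KL}}(q \,\|\, \pi) \Big\},$$

which reduces to standard variational inference at $\alpha = 1$ and, for $\alpha < 1$, amounts to a KL projection of the fractional posterior $\pi_{n,\alpha} \propto L_n^{\alpha}\,\pi$ onto $\Gamma$.

A minimal numerical sketch, under our own toy assumptions (a Gaussian mean model with known variance and a conjugate Gaussian prior, chosen because the fractional posterior is then available in closed form and the optimal Gaussian variational density coincides with it exactly):

import numpy as np

def alpha_vb_gaussian_mean(x, sigma2, tau2, alpha):
    # Fractional-posterior precision: alpha-tempered likelihood precision
    # plus prior precision (our toy derivation, not taken from the paper).
    n = len(x)
    precision = alpha * n / sigma2 + 1.0 / tau2
    mean = (alpha * np.sum(x) / sigma2) / precision
    return mean, 1.0 / precision

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)  # true mean is 2.0

for alpha in (1.0, 0.5):
    m, v = alpha_vb_gaussian_mean(x, sigma2=1.0, tau2=10.0, alpha=alpha)
    print(f"alpha={alpha}: variational mean {m:.3f}, variance {v:.5f}")

Consistent with the abstract's claim about point estimates, the variational mean in this toy example concentrates at the true parameter at the usual $1/\sqrt{n}$ rate for any fixed $\alpha \in (0,1]$; smaller $\alpha$ merely downweights the data relative to the prior.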
