Mini-batch Tempered MCMC.

Authors
Dangna Li, Wing H Wong

In this paper we propose a general framework for performing MCMC with only a mini-batch of data. We show that by estimating the Metropolis-Hastings ratio with only a mini-batch of data, one is essentially sampling from the true posterior raised to a known temperature. We show by experiments that our method, Mini-batch Tempered MCMC (MINT-MCMC), can efficiently explore multiple modes of a posterior distribution. We demonstrate the application of MINT-MCMC as an inference tool for Bayesian neural networks. We also show that a cyclic version of our algorithm can be applied to build an ensemble of neural networks with little additional training cost. Building on the Equi-Energy sampler (Kou et al. 2006), we developed a new parallel MCMC algorithm that enables efficient sampling from high-dimensional multi-modal posteriors with well-separated modes. We apply this algorithm to reconstruct the landscape of an energy function.
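To make the core idea concrete, here is a minimal sketch of a Metropolis-Hastings step whose log-acceptance ratio is estimated from a mini-batch rather than the full dataset. This is an illustration of the general idea only, not the paper's algorithm: the toy model (posterior mean of a 1D Gaussian), the function names (`mini_batch_mh`, `log_lik`), and all parameter values are hypothetical.

```python
import math
import random

random.seed(0)

# Hypothetical toy dataset: 1,000 observations from N(2, 1).
N = 1000
data = [random.gauss(2.0, 1.0) for _ in range(N)]

def log_lik(theta, x):
    # Gaussian log-likelihood with unit variance (additive constant dropped).
    return -0.5 * (x - theta) ** 2

def mini_batch_mh(n_iter=2000, batch_size=50, step=0.2):
    """Random-walk MH where the log-ratio is estimated on a mini-batch."""
    theta = 0.0
    samples = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        batch = random.sample(data, batch_size)
        # Rescale the mini-batch log-likelihood ratio to the full-data scale.
        # Accepting on this noisy estimate is what, per the paper's argument,
        # amounts to sampling the posterior raised to a known temperature.
        log_r = (N / batch_size) * sum(
            log_lik(prop, x) - log_lik(theta, x) for x in batch
        )
        if math.log(random.random()) < log_r:
            theta = prop
        samples.append(theta)
    return samples

samples = mini_batch_mh()
# Discard an (arbitrarily chosen) burn-in before summarizing the chain.
post_mean = sum(samples[500:]) / len(samples[500:])
```

Each iteration touches only `batch_size` of the `N` data points, which is the source of the speedup; the price, as the abstract notes, is that the chain targets a tempered (flattened) version of the posterior, which is precisely what helps it hop between well-separated modes.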