Variance Reduction in Black-box Variational Inference by Adaptive Importance Sampling
Ximing Li, Changchun Li, Jinjin Chi, Jihong Ouyang
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2404-2410.
https://doi.org/10.24963/ijcai.2018/333
Overdispersed black-box variational inference employs importance sampling to reduce the variance of the Monte Carlo gradient in black-box variational inference, but it relies on a simple overdispersed proposal distribution. This paper investigates how to adaptively obtain a better proposal distribution for lower variance. To this end, we directly approximate the theoretically optimal proposal using a Monte Carlo moment matching step at each variational iteration. We call this adaptive proposal the moment matching proposal (MMP). Experimental results on two Bayesian models show that MMP effectively reduces variance in black-box learning and outperforms baseline inference algorithms.
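To illustrate the moment-matching idea in the abstract, the sketch below adapts a Gaussian importance-sampling proposal by matching the weighted first two moments of the samples at each iteration. This is a minimal toy example, not the paper's algorithm: the one-dimensional target, the integrand `f`, and all identifiers are illustrative assumptions, and the variance-optimal proposal r* ∝ |f|·q is approximated within the Gaussian family only.

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_matching_proposal(samples, weights):
    """Fit a Gaussian proposal by matching the weighted first two moments."""
    w = weights / weights.sum()
    mu = np.sum(w * samples)
    var = np.sum(w * (samples - mu) ** 2)
    return mu, np.sqrt(var)

# Toy setup: estimate E_q[f(z)] under q = N(0, 1) with f(z) = z^2,
# drawing importance samples from an adaptive Gaussian proposal r.
def f(z):
    return z ** 2

def log_q(z):
    return -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)

def log_r(z, mu, sigma):
    return -0.5 * ((z - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

mu, sigma = 0.0, 2.0  # start from a simple overdispersed proposal
for step in range(5):
    z = rng.normal(mu, sigma, size=50_000)
    w = np.exp(log_q(z) - log_r(z, mu, sigma))  # importance weights q/r
    est = np.mean(w * f(z))                     # IS estimate of E_q[f]
    # Adapt: match moments of the |f|-weighted distribution, which
    # approximates the variance-optimal proposal r* proportional to |f|*q.
    mu, sigma = moment_matching_proposal(z, w * np.abs(f(z)))

print(est)  # ground truth: E_q[z^2] = 1
```

For f(z) = z² and q = N(0, 1), the moment-matched proposal converges toward mean 0 and variance E_q[z⁴]/E_q[z²] = 3, concentrating samples where the integrand contributes most and shrinking the estimator's variance relative to sampling from q directly.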
Keywords:
Machine Learning: Learning Generative Models
Machine Learning: Probabilistic Machine Learning