Abstract
Fast Laplace Approximation for Sparse Bayesian Spike and Slab Models
Syed Abbas Z. Naqvi, Shandian Zhe, Yuan Qi, Yifan Yang, Jieping Ye
We consider the application of Bayesian spike-and-slab models to high-dimensional feature selection problems. To this end, we propose a simple yet effective fast approximate Bayesian inference algorithm based on Laplace's method. We exploit two efficient optimization methods, GIST and L-BFGS, to obtain the mode of the posterior distribution. We then propose an ensemble Nyström-based approach to compute the diagonal of the inverse Hessian at the mode, which yields the approximate posterior marginals in O(knp) time, where k ≪ p. Furthermore, we provide a theoretical analysis of the estimation consistency and approximation error bounds. Given the posterior marginals of the model weights, we use quadrature integration to estimate the marginal posteriors of the selection probabilities and indicator variables for all features, which quantify the selection uncertainty. Our method not only retains the benefits of the Bayesian treatment (e.g., uncertainty quantification) but also possesses the computational efficiency and oracle properties of frequentist methods. Simulations show that our method estimates selection probabilities and indicator variables comparably to or better than alternative approximate inference methods such as variational Bayes (VB) and expectation propagation (EP), with less running time. Extensive experiments on large real-world datasets demonstrate that our method often improves prediction accuracy over Bayesian automatic relevance determination, EP, and frequentist L1-type methods.
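The abstract describes a three-step pipeline: find the posterior mode with a fast optimizer, approximate the diagonal of the inverse Hessian at the mode via Nyström sampling, and feed the resulting Gaussian marginals into quadrature for per-feature selection probabilities. The following is a minimal NumPy/SciPy sketch of that pipeline, not the authors' implementation: it assumes a smooth Gaussian (ridge) stand-in for the slab prior so that L-BFGS applies directly, uses a plain regularized Nyström step rather than the ensemble variant analyzed in the paper, omits the GIST solver and the quadrature step, and all variable names are illustrative.

```python
# Sketch of: mode finding (L-BFGS) + Nystrom/Woodbury diag of the inverse
# Hessian, under a Gaussian stand-in prior. NOT the paper's implementation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p, k = 200, 1000, 50            # samples, features, Nystrom landmarks (k << p)
X = rng.standard_normal((n, p))
w_true = np.zeros(p); w_true[:5] = 2.0
y = X @ w_true + 0.1 * rng.standard_normal(n)
lam = 1.0                           # precision of the Gaussian stand-in prior

def neg_log_post(w):                # negative log-posterior (Gaussian model)
    r = X @ w - y
    return 0.5 * r @ r + 0.5 * lam * w @ w

def grad(w):
    return X.T @ (X @ w - y) + lam * w

# Step 1: posterior mode via L-BFGS (the paper also uses GIST for the
# non-smooth spike-and-slab formulation; that solver is omitted here).
w_map = minimize(neg_log_post, np.zeros(p), jac=grad, method="L-BFGS-B").x

# Step 2: diag of the inverse Hessian at the mode without forming the p x p
# Hessian H = X^T X + lam*I. Nystrom-approximate X^T X from k sampled
# columns, then apply the Woodbury identity to H ~ lam*I + U U^T, giving
# diag(H^{-1}) in O(p k^2) after the O(knp) column products below.
idx = rng.choice(p, size=k, replace=False)
C = X.T @ X[:, idx]                 # p x k sampled columns of X^T X
W = C[idx, :]                       # k x k intersection block
ew, Vw = np.linalg.eigh(W + 1e-8 * np.eye(k))
U = C @ (Vw / np.sqrt(np.maximum(ew, 1e-8))) @ Vw.T   # X^T X ~ U @ U.T
M = np.linalg.inv(np.eye(k) + (U.T @ U) / lam)        # Woodbury core, k x k
# diag(H^{-1}) = 1/lam - (1/lam^2) * diag(U M U^T), computed row-wise.
diag_Hinv = 1.0 / lam - np.einsum("ij,jk,ik->i", U, M, U) / lam**2

# Step 3 (not shown): the Laplace marginals N(w_map[j], diag_Hinv[j]) would
# then be integrated by quadrature to obtain per-feature selection
# probabilities and indicator posteriors.
print(w_map[:5], diag_Hinv[:5])
```

The Woodbury step is what keeps the cost linear in p: the k-column Nyström factor U is the only object of size p x k, so the diagonal of the inverse Hessian is recovered without ever materializing or inverting a p x p matrix.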