Variational Mixtures of Gaussian Processes for Classification
Chen Luo, Shiliang Sun
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4603-4609.
https://doi.org/10.24963/ijcai.2017/642
Gaussian Processes (GPs) are powerful tools for machine learning that have been applied to both classification and regression. Mixtures of GPs were later proposed to further improve the modeling capability of GPs. However, these mixture models are formulated only for regression. In this work, we propose a new Mixture of Gaussian Processes for Classification (MGPC). Instead of the Gaussian likelihood used for regression, MGPC employs the logistic function as the likelihood to obtain class probabilities, which is suitable for classification problems. The posterior distribution of the latent variables is approximated through variational inference. The hyperparameters are optimized through the variational EM method and a greedy algorithm. Experiments on multiple real-world datasets show improvements in predictive performance over five widely used methods. The results also indicate that for classification MGPC is significantly better than the regression model with mixtures of GPs, in contrast to the existing consensus that their single-model counterparts are comparable.
Keywords:
Uncertainty in AI: Approximate Probabilistic Inference
Uncertainty in AI: Graphical Models
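
To make the model structure in the abstract concrete, here is a minimal, self-contained Python sketch (not the authors' code) of the core idea: K Gaussian-process experts over a latent function, a logistic likelihood mapping the latent function to class probabilities, and gating weights mixing the experts. The RBF kernel, the nearest-centre gating, and the plug-in sigmoid-of-the-mean predictive are illustrative assumptions; the paper's variational inference, variational EM, and greedy hyperparameter optimization are not shown.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential GP covariance k(a, b), a common kernel choice."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sigmoid(f):
    """Logistic likelihood link: p(y = 1 | f) = sigma(f)."""
    return 1.0 / (1.0 + np.exp(-f))

def predict_proba(X, f, centres, X_test, lengthscales):
    """Plug-in predictive p(y*=1 | x*) = sum_k pi_k(x*) * sigma(mu_k(x*))."""
    n_test, K = X_test.shape[0], len(centres)
    # Gating: softmax over negative squared distances to component centres
    # (a hypothetical stand-in for the learned gating of a GP mixture).
    logits = -np.array([np.sum((X_test - c)**2, 1) for c in centres]).T
    gates = np.exp(logits - logits.max(1, keepdims=True))
    gates /= gates.sum(1, keepdims=True)
    proba = np.zeros(n_test)
    for k in range(K):
        # GP posterior mean of expert k's latent function at the test points,
        # conditioning on (assumed known) latent values f at the training inputs.
        Knn = rbf_kernel(X, X, lengthscales[k]) + 1e-6 * np.eye(len(X))
        Ksn = rbf_kernel(X_test, X, lengthscales[k])
        mu_k = Ksn @ np.linalg.solve(Knn, f)
        proba += gates[:, k] * sigmoid(mu_k)  # mixture of logistic GP experts
    return proba

# Toy usage: two experts with different lengthscales on 1-D inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
f = np.sin(2.0 * X[:, 0])  # stand-in latent function values
centres = [np.array([-1.5]), np.array([1.5])]
X_test = np.linspace(-3, 3, 5)[:, None]
print(predict_proba(X, f, centres, X_test, lengthscales=[0.5, 1.5]))

In the full model, the latent values f and the assignments implicit in the gating would be inferred variationally rather than fixed; this sketch only shows how the logistic link and the mixture weights combine GP experts into class probabilities.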