Small-Variance Asymptotics for Nonparametric Bayesian Overlapping Stochastic Blockmodels
Gundeep Arora, Anupreet Porwal, Kanupriya Agarwal, Avani Samdariya, Piyush Rai
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2000-2006.
https://doi.org/10.24963/ijcai.2018/276
The latent feature relational model (LFRM) for graphs represents each node as having binary memberships in one or more communities. The community memberships can be represented as a binary vector, and LFRM defines the link probability between any pair of nodes as a bilinear function of their community membership vectors. Moreover, placing a nonparametric Bayesian prior - the Indian Buffet Process - on the community membership matrix enables learning the number of communities automatically from the data. However, despite its modeling flexibility, strong link-prediction performance, and the interpretability of its binary embeddings, inference in LFRM remains a challenge and is typically done via MCMC or variational methods, which can be slow and may take a long time to converge. In this work, we apply the small-variance asymptotics idea to the nonparametric Bayesian LFRM, utilizing the connection between exponential families and Bregman divergences. This leads to an overlapping-k-means-like objective function for the nonparametric Bayesian LFRM, which can be optimized using generic or specialized solvers. We also propose an iterative greedy algorithm to optimize the objective function and compare our approach with other inference methods on several benchmark datasets. Our results demonstrate that our inference algorithm is competitive with methods such as MCMC while being much faster.
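To make the bilinear link-probability construction concrete, here is a minimal sketch of the standard LFRM likelihood the abstract refers to. The names `z_i`, `z_j` (binary membership vectors), `W` (community interaction matrix), and the sigmoid link function are assumptions based on the common formulation, not details taken from this abstract.

```python
import numpy as np

def sigmoid(x):
    """Logistic link mapping a real-valued score to a probability."""
    return 1.0 / (1.0 + np.exp(-x))

def link_probability(z_i, z_j, W):
    """LFRM-style link probability: sigmoid of the bilinear score z_i^T W z_j.

    z_i, z_j : binary community-membership vectors (one entry per community)
    W        : real-valued matrix of pairwise community interaction weights
    (hypothetical names used for illustration)
    """
    return sigmoid(z_i @ W @ z_j)

# Example: two nodes with overlapping memberships across 3 communities.
z_i = np.array([1, 0, 1])
z_j = np.array([1, 1, 0])
W = np.array([[ 2.0, -1.0, 0.5],
              [-1.0,  1.5, 0.0],
              [ 0.5,  0.0, 1.0]])

p = link_probability(z_i, z_j, W)  # probability of an edge between the nodes
```

Only the communities both nodes belong to (and their cross-interactions in `W`) contribute to the score, which is what makes the binary memberships interpretable as overlapping communities.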
Keywords:
Machine Learning: Relational Learning
Machine Learning: Probabilistic Machine Learning