Gaussian Embedding of Linked Documents from a Pretrained Semantic Space
Antoine Gourru, Julien Velcin, Julien Jacques
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3912-3918.
https://doi.org/10.24963/ijcai.2020/541
Gaussian Embedding of Linked Documents (GELD) is a new method that embeds linked documents (e.g., citation networks) onto a pretrained semantic space (e.g., a set of word embeddings).
We formulate the problem by modeling each document as a Gaussian distribution in the word vector space.
We design a generative model that combines both words and links in a consistent way.
Leveraging the variance of a document allows us to model the uncertainty related to word and link generation.
In most cases, our method outperforms state-of-the-art methods when our document vectors are used as features for standard downstream tasks.
In particular, GELD achieves better accuracy in classification and link prediction on Cora and Dblp. In addition, we qualitatively demonstrate several convenient properties of our method. We provide the implementation of GELD and the evaluation datasets to the community (https://github.com/AntoineGourru/DNEmbedding).
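The core representation described above, a document as a Gaussian distribution in a pretrained word-vector space, can be sketched with a naive empirical estimator. This is only an illustration of the representation, not GELD's actual generative model or inference procedure; the toy embeddings and the `document_gaussian` helper are assumptions for the example.

```python
import numpy as np

# Toy pretrained word embeddings (in practice, e.g., word2vec or GloVe vectors).
word_vectors = {
    "graph": np.array([1.0, 0.0]),
    "embedding": np.array([0.8, 0.2]),
    "citation": np.array([0.1, 0.9]),
}

def document_gaussian(tokens, embeddings):
    """Represent a document as a Gaussian (mean, covariance) over its word vectors.

    A simple empirical estimate: the mean places the document in the
    semantic space, while the covariance captures the spread (uncertainty)
    of its vocabulary -- the role the variance plays in GELD's abstract.
    """
    vecs = np.stack([embeddings[t] for t in tokens if t in embeddings])
    mu = vecs.mean(axis=0)               # document position in the semantic space
    sigma = np.cov(vecs, rowvar=False)   # semantic spread of the document's words
    return mu, sigma

mu, sigma = document_gaussian(["graph", "embedding", "citation"], word_vectors)
```

Such a (mean, covariance) pair can then serve as a document feature, e.g., the mean vector for classification or a distributional distance between Gaussians for link prediction.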
Keywords:
Natural Language Processing: Embeddings
Machine Learning: Probabilistic Machine Learning
Machine Learning: Learning Generative Models