Deep Context: A Neural Language Model for Large-scale Networked Documents

Hao Wu, Kristina Lerman

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3091-3097. https://doi.org/10.24963/ijcai.2017/431

We propose a scalable neural language model that leverages the links between documents to learn the deep context of documents. Our model, Deep Context Vector, takes advantage of distributed representations to exploit the word order in document sentences, as well as the semantic connections among linked documents in a document network. We evaluate our model on large-scale data collections that include Wikipedia pages, and scientific and legal citation networks. We demonstrate its effectiveness and efficiency on document classification and link prediction tasks.
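To make the general idea concrete, the sketch below shows one plausible way to jointly learn document vectors from (a) a word-prediction objective within each document and (b) a link objective over the document network. This is not the paper's Deep Context Vector implementation; the class name, hyperparameters, and the use of context averaging (rather than an order-preserving composition) are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code): document vectors trained with a
# paragraph-vector-style text objective plus a link objective, so the embedding
# reflects both word context and the document network. All names are illustrative.

import torch
import torch.nn as nn
import torch.nn.functional as F

class JointDocModel(nn.Module):
    def __init__(self, n_docs, n_words, dim=100):
        super().__init__()
        self.doc_emb = nn.Embedding(n_docs, dim)    # one vector per document
        self.word_emb = nn.Embedding(n_words, dim)  # input word vectors
        self.word_out = nn.Linear(dim, n_words)     # softmax layer over the vocabulary

    def text_loss(self, doc_ids, context_word_ids, target_word_ids):
        # Predict a target word from its context words plus the document vector.
        # Averaging the context is used here for brevity; an order-preserving
        # composition (e.g., concatenation) would be closer to exploiting word order.
        ctx = self.word_emb(context_word_ids).mean(dim=1) + self.doc_emb(doc_ids)
        return F.cross_entropy(self.word_out(ctx), target_word_ids)

    def link_loss(self, src_doc_ids, dst_doc_ids, neg_doc_ids):
        # Pull vectors of linked documents together and push randomly sampled
        # (negative) documents apart, via a negative-sampling style objective.
        src = self.doc_emb(src_doc_ids)
        pos = (src * self.doc_emb(dst_doc_ids)).sum(-1)
        neg = (src * self.doc_emb(neg_doc_ids)).sum(-1)
        return -(F.logsigmoid(pos) + F.logsigmoid(-neg)).mean()
```

In such a setup, the two losses would be summed (possibly with a weighting factor) and minimized over minibatches of word-context windows and document links; the learned document vectors could then be used for the classification and link prediction tasks mentioned in the abstract.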
Keywords:
Machine Learning: Neural Networks
Natural Language Processing: Natural Language Semantics
Natural Language Processing: Natural Language Processing
Natural Language Processing: Text Classification