Neural Entity Summarization with Joint Encoding and Weak Supervision
Junyou Li, Gong Cheng, Qingxia Liu, Wen Zhang, Evgeny Kharlamov, Kalpa Gunaratna, Huajun Chen
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 1644-1650.
https://doi.org/10.24963/ijcai.2020/228
In a large-scale knowledge graph (KG), an entity is often described by a large number of triple-structured facts. Many applications require abridged versions of entity descriptions, called entity summaries. Existing solutions to entity summarization are mainly unsupervised. In this paper, we present NEST, a supervised approach based on a novel neural model that jointly encodes graph structure and text in KGs to generate high-quality, diversified summaries. Since manually labeled summaries are costly to obtain for training, our supervision is weak: we train with programmatically labeled data, which may contain noise but requires no manual work. Evaluation results show that our approach significantly outperforms the state of the art on two public benchmarks.
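To make the weak-supervision idea concrete, the following is a minimal sketch of programmatic labeling for entity summarization. The heuristic (preferring an entity's rarest predicates as a crude proxy for informativeness) and all names are illustrative assumptions, not the paper's actual labeling procedure; the point is only that labels are produced by a program rather than by annotators, so they are cheap but noisy.

```python
# Hypothetical weak-labeling sketch: the heuristic below is an assumption
# for illustration, not the labeling rules used by NEST.
from collections import Counter

def weak_label(triples, k=2):
    """Label each (subject, predicate, object) triple 1 if its predicate is
    among the k rarest predicates in this entity's description, else 0.
    Rarer predicates are assumed to be more distinctive for the entity;
    the labels are noisy but require no manual annotation."""
    pred_freq = Counter(p for _, p, _ in triples)
    # Stable sort: triples with rarer predicates come first.
    ranked = sorted(triples, key=lambda t: pred_freq[t[1]])
    selected = set(ranked[:k])
    return [1 if t in selected else 0 for t in triples]

triples = [
    ("Berlin", "type", "City"),
    ("Berlin", "type", "Capital"),
    ("Berlin", "country", "Germany"),
    ("Berlin", "population", "3645000"),
]
labels = weak_label(triples, k=2)  # marks the two rarest-predicate triples
```

Labels produced this way can then serve as noisy relevance targets for training a neural triple ranker, in the spirit of the approach described above.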
Keywords:
Knowledge Representation and Reasoning: Semantic Web
Machine Learning: Deep Learning
Machine Learning: Learning Preferences or Rankings