Commonsense Knowledge Aware Conversation Generation with Graph Attention
Hao Zhou, Tom Young, Minlie Huang, Haizhou Zhao, Jingfang Xu, Xiaoyan Zhu
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4623-4629.
https://doi.org/10.24963/ijcai.2018/643
Commonsense knowledge is vital to many natural language processing tasks. In this paper, we present a novel open-domain conversation generation model to demonstrate how large-scale commonsense knowledge can facilitate language understanding and generation. Given a user post, the model retrieves relevant knowledge graphs from a knowledge base and then encodes the graphs with a static graph attention mechanism, which augments the semantic information of the post and thus supports better understanding of it. During word generation, the model attentively reads the retrieved knowledge graphs, and the knowledge triples within each graph, through a dynamic graph attention mechanism to facilitate better generation. This is the first attempt to use large-scale commonsense knowledge in conversation generation. Furthermore, unlike existing models that use knowledge triples (entities) separately and independently, our model treats each knowledge graph as a whole, thereby encoding the more structured, connected semantic information in the graphs. Experiments show that the proposed model can generate more appropriate and informative responses than state-of-the-art baselines.
Keywords:
Natural Language Processing: Dialogue
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Processing
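The static graph attention described in the abstract scores each knowledge triple and pools the graph into one vector. The following is a minimal NumPy sketch under assumed details: a bilinear-style score beta_n = (Wr r_n)^T tanh(Wh h_n + Wt t_n) over triples (h_n, r_n, t_n), with the graph vector taken as the attention-weighted sum of concatenated head/tail embeddings. All names and dimensions here are illustrative, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def static_graph_attention(heads, rels, tails, Wh, Wr, Wt):
    """Encode one retrieved knowledge graph as a single vector.

    Assumed scoring form (a sketch, not the paper's exact code):
        beta_n  = (Wr @ r_n)^T tanh(Wh @ h_n + Wt @ t_n)
        alpha_n = softmax(beta)_n
        g       = sum_n alpha_n * [h_n; t_n]
    """
    betas = np.array([
        (Wr @ r) @ np.tanh(Wh @ h + Wt @ t)
        for h, r, t in zip(heads, rels, tails)
    ])
    alphas = softmax(betas)                          # attention over triples
    pairs = np.concatenate([heads, tails], axis=1)   # [h_n; t_n] per triple
    return alphas, alphas @ pairs                    # weights, graph vector g

# Toy graph: 4 triples with 8-dimensional entity/relation embeddings.
N, d = 4, 8
heads, rels, tails = (rng.normal(size=(N, d)) for _ in range(3))
Wh, Wr, Wt = (rng.normal(size=(d, d)) for _ in range(3))
alphas, g = static_graph_attention(heads, rels, tails, Wh, Wr, Wt)
```

Because the whole graph is pooled into `g` before decoding, the post encoder sees connected triple-level structure rather than isolated entities; the dynamic attention at decoding time would then re-weight graphs and triples per generated word.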