Proceedings Abstracts of the Twenty-Fifth International Joint Conference on Artificial Intelligence

Employing External Rich Knowledge for Machine Comprehension / 2929
Bingning Wang, Shangmin Guo, Kang Liu, Shizhu He, Jun Zhao

The recently proposed machine comprehension (MC) task is an effort to address the natural language understanding problem. However, the small size of labeled machine comprehension data limits the application of deep neural network architectures, which have shown advantages in semantic inference tasks. Previous methods use many NLP tools to extract linguistic features but gain only little improvement over simple baselines. In this paper, we build an attention-based recurrent neural network model, train it with the help of external knowledge that is semantically relevant to machine comprehension, and achieve a new state-of-the-art result.
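The abstract's attention mechanism can be illustrated with a minimal sketch: given recurrent hidden states for a passage and a question vector, attention scores each state by its similarity to the question and returns a weighted summary. This is an illustrative toy (the helper names `attend`, `softmax`, and `dot`, and dot-product scoring, are assumptions for exposition), not the authors' actual model.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(hidden_states, question_vec):
    """Score each hidden state against the question (dot product, an
    illustrative choice), normalize with softmax, and return the
    attention weights plus the weighted context vector."""
    scores = [dot(h, question_vec) for h in hidden_states]
    weights = softmax(scores)
    dim = len(question_vec)
    context = [sum(w * h[i] for w, h in zip(weights, hidden_states))
               for i in range(dim)]
    return weights, context

# Toy example: three hidden states; the question vector aligns with the
# second state, so it should receive the largest attention weight.
states = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
question = [0.0, 2.0]
weights, context = attend(states, question)
```

In a full MC model the question vector and hidden states would come from trained recurrent encoders; the attended context vector then feeds the answer-scoring layer.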