Topic-to-Essay Generation with Neural Networks

Xiaocheng Feng, Ming Liu, Jiahao Liu, Bing Qin, Yibo Sun, Ting Liu

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4078-4084. https://doi.org/10.24963/ijcai.2018/567

We focus on essay generation, a challenging task that aims to generate a paragraph-level text covering multiple topics. Progress towards understanding different topics and expressing diversity in this task requires more powerful generators and richer training and evaluation resources. To address this, we develop a multi-topic-aware long short-term memory (MTA-LSTM) network. In this model, we maintain a novel multi-topic coverage vector, which learns the weight of each topic and is sequentially updated during the decoding process. This vector is then fed to an attention model to guide the generator. Moreover, we automatically construct two paragraph-level Chinese essay corpora, containing 305,000 essay paragraphs and 55,000 question-and-answer pairs, respectively. Empirical results show that our approach obtains a much better BLEU score than various baselines. Furthermore, human judgment shows that MTA-LSTM is able to generate essays that are not only coherent but also closely related to the input topics.
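To make the decoding mechanism described above concrete, the following is a minimal PyTorch-style sketch (not the authors' released code) of a coverage-guided topic attention step: a coverage vector over the input topics weights the attention distribution, produces a topic context vector for the LSTM decoder, and is updated after every generated word. Module and parameter names such as TopicCoverageAttention and MTALSTMDecoder are illustrative assumptions, not names from the paper.

import torch
import torch.nn as nn

class TopicCoverageAttention(nn.Module):
    """Attention over topic embeddings, modulated by a coverage vector."""
    def __init__(self, topic_dim: int, hidden_dim: int):
        super().__init__()
        self.w_h = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_t = nn.Linear(topic_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, hidden, topics, coverage):
        # hidden:   (batch, hidden_dim)    previous decoder state
        # topics:   (batch, k, topic_dim)  embeddings of the k input topics
        # coverage: (batch, k)             remaining "budget" of each topic
        scores = self.v(torch.tanh(
            self.w_h(hidden).unsqueeze(1) + self.w_t(topics))).squeeze(-1)  # (batch, k)
        # Down-weight topics that have already been covered, then renormalize.
        attn = torch.softmax(scores, dim=-1) * coverage
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)
        context = torch.bmm(attn.unsqueeze(1), topics).squeeze(1)  # (batch, topic_dim)
        coverage = torch.clamp(coverage - attn, min=0.0)           # sequential update
        return context, attn, coverage

class MTALSTMDecoder(nn.Module):
    """Illustrative decoder: the topic context vector is concatenated with the
    word embedding at each step and fed to an LSTM cell."""
    def __init__(self, vocab_size, embed_dim, topic_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attn = TopicCoverageAttention(topic_dim, hidden_dim)
        self.cell = nn.LSTMCell(embed_dim + topic_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def step(self, word_ids, state, topics, coverage):
        h, c = state
        context, attn, coverage = self.attn(h, topics, coverage)
        x = torch.cat([self.embed(word_ids), context], dim=-1)
        h, c = self.cell(x, (h, c))
        return self.out(h), (h, c), coverage

In this sketch the coverage vector starts at all ones and shrinks as attention mass is spent on each topic; the exact update and normalization in the paper may differ, but the overall flow (coverage-weighted attention guiding an LSTM generator) matches the abstract's description.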
Keywords:
Machine Learning: Neural Networks
Natural Language Processing: Natural Language Generation
Machine Learning: Deep Learning