Abstract Rule Learning for Paraphrase Generation

Xianggen Liu, Wenqiang Lei, Jiancheng Lv, Jizhe Zhou

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 4273-4279. https://doi.org/10.24963/ijcai.2022/593

Early work on paraphrase generation typically adopted rule-based methods, which are interpretable and able to make global transformations to the original sentence, but which struggle to produce fluent paraphrases. Recently, deep neural networks have shown impressive performance in generating paraphrases. However, current neural models are black boxes and tend to make only local modifications to the inputs. In this work, we combine these two approaches into RULER, a novel approach that performs abstract rule learning for paraphrasing. The key idea is to explicitly learn generalizable rules that can enhance the paraphrase generation process of neural networks. In RULER, we first propose a rule generalizability metric to guide the model toward generating the rules that underlie paraphrasing. Then, we leverage neural networks to generate paraphrases by refining the sentences transformed by the learned rules. Extensive experimental results demonstrate the superiority of RULER over previous state-of-the-art methods in terms of paraphrase quality, generalization ability, and interpretability.
Keywords:
Natural Language Processing: Language Generation
Natural Language Processing: Summarization
Knowledge Representation and Reasoning: Learning and reasoning
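The abstract describes a two-stage pipeline: learned abstract rules first apply a global transformation to the input sentence, and a neural model then refines the rough rewrite into a fluent paraphrase. As a loose illustration only (the rule representation, function names, and the placeholder refinement step below are assumptions for exposition, not RULER's actual implementation), a minimal sketch of such a pipeline might look like:

```python
import re
from typing import Optional

# A toy abstract rule: a pattern with variables (X, Y) and a rewrite template.
# This rule format is an illustrative assumption, not the paper's representation.
RULE = {
    "pattern": r"^(?P<X>.+) is located in (?P<Y>.+)$",
    "template": "{Y} is home to {X}",
}

def apply_rule(sentence: str, rule: dict) -> Optional[str]:
    """Apply an abstract rule, producing a rough (possibly disfluent) rewrite."""
    match = re.match(rule["pattern"], sentence)
    if match is None:
        return None
    return rule["template"].format(**match.groupdict())

def neural_refine(draft: str) -> str:
    """Placeholder for the neural refinement stage; a real system would run a
    sequence-to-sequence model to smooth the rule-transformed draft."""
    return draft  # identity stand-in, for illustration only

def paraphrase(sentence: str, rule: dict) -> str:
    draft = apply_rule(sentence, rule)
    if draft is None:          # fall back to the original sentence if no rule fires
        draft = sentence
    return neural_refine(draft)

print(paraphrase("The Eiffel Tower is located in Paris", RULE))
# -> "Paris is home to The Eiffel Tower"
```

In this sketch the rule handles the global restructuring while the (here trivial) refinement step stands in for the neural model that would repair fluency; how rules are induced and scored for generalizability is the subject of the paper itself.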