Utilizing Non-Parallel Text for Style Transfer by Making Partial Comparisons
Di Yin, Shujian Huang, Xin-Yu Dai, Jiajun Chen
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 5379-5386.
https://doi.org/10.24963/ijcai.2019/747
Text style transfer aims to rephrase a given sentence in a different style without changing its original content. Since parallel corpora (i.e., sentence pairs with the same content but different styles) are usually unavailable, most previous work guides the transfer process solely with distributional information, i.e., style-related classifiers or language models. This neglects the correspondence between instances and leads to poor transfer performance, especially in content preservation. In this paper, we propose making partial comparisons to explicitly model the content correspondence and the style correspondence of instances. To train the partial comparators, we propose methods to automatically extract partial-parallel training instances from the non-parallel data, and to further strengthen training with data augmentation. We compare our method with existing approaches on two review datasets. Both automatic and manual evaluations show that our approach significantly improves the performance of existing adversarial methods and outperforms most state-of-the-art models. Our code and data will be available on GitHub.
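The abstract does not specify the comparator architecture; below is a minimal, hypothetical sketch (not the authors' implementation) of what a content comparator for pseudo-parallel pairs might look like. All names (ContentComparator, embed_dim, hidden_dim) and the pairwise matching features are assumptions chosen for illustration; the actual model, the extraction of partial-parallel instances, and the data augmentation are described in the paper itself.

```python
# Illustrative sketch only: a "partial comparator" that scores whether two
# sentence representations share the same content. Architecture and names
# are assumptions, not taken from the paper.
import torch
import torch.nn as nn

class ContentComparator(nn.Module):
    """Scores a pair of sentence embeddings for content agreement (0..1)."""
    def __init__(self, embed_dim: int = 256, hidden_dim: int = 128):
        super().__init__()
        # Compare the two representations via concatenation, absolute
        # difference, and element-wise product (a common matching scheme).
        self.scorer = nn.Sequential(
            nn.Linear(embed_dim * 4, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        features = torch.cat([x, y, torch.abs(x - y), x * y], dim=-1)
        return torch.sigmoid(self.scorer(features)).squeeze(-1)

if __name__ == "__main__":
    comparator = ContentComparator()
    src = torch.randn(8, 256)   # batch of source-sentence embeddings
    gen = torch.randn(8, 256)   # batch of generated-sentence embeddings
    labels = torch.ones(8)      # 1 = pair shares content (positive pairs)
    loss = nn.functional.binary_cross_entropy(comparator(src, gen), labels)
    loss.backward()
    print(f"content-comparison loss: {loss.item():.4f}")
```

In practice such a comparator would be trained on positive pairs extracted from the non-parallel data (and negatives sampled otherwise), and its score used as an instance-level training signal alongside the usual distributional objectives; a second comparator of the same form would handle style correspondence.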
Keywords:
Natural Language Processing: Natural Language Generation
Natural Language Processing: Natural Language Processing
Natural Language Processing: NLP Applications and Tools