Inverted Bilingual Topic Models for Lexicon Extraction from Non-parallel Data

Tengfei Ma, Tetsuya Nasukawa

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4075-4081. https://doi.org/10.24963/ijcai.2017/569

Topic models have been successfully applied to lexicon extraction. However, most previous methods are limited to document-aligned data. In this paper, we address two challenges of applying topic models to lexicon extraction from non-parallel data: 1) the difficulty of modeling word relationships, and 2) noise in the seed dictionary. To meet these challenges, we propose two new bilingual topic models that better capture the semantic information of each word while discriminating among the multiple translations in a noisy seed dictionary. We extend the scope of topic models by inverting the roles of "word" and "document". In addition, to handle the noise in the seed dictionary, we incorporate the probability of translation selection into our models. We also propose an effective measure for evaluating the similarity of words across languages and selecting the optimal translation pairs. Experimental results on real-world data demonstrate the utility and efficacy of the proposed models.
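The core idea of "inverting" the roles of word and document can be illustrated with a minimal sketch: each vocabulary word becomes a pseudo-document whose tokens are the IDs of the documents (or contexts) in which it occurs, so a standard topic model can then be run over words rather than documents. The function name `invert_corpus` and the toy corpus are illustrative assumptions, not the paper's actual bilingual models.

```python
from collections import defaultdict

def invert_corpus(docs):
    """Invert the roles of "word" and "document": each word becomes a
    pseudo-document listing the IDs of the documents it occurs in.
    (Illustrative sketch only; the paper's bilingual topic models
    additionally link the two languages via a seed dictionary.)"""
    pseudo_docs = defaultdict(list)
    for doc_id, tokens in enumerate(docs):
        for tok in tokens:
            pseudo_docs[tok].append(doc_id)
    return dict(pseudo_docs)

# Toy monolingual corpus of three tiny documents.
corpus = [["topic", "model"], ["topic", "lexicon"], ["model"]]
inverted = invert_corpus(corpus)
print(inverted["topic"])  # → [0, 1]: "topic" occurs in documents 0 and 1
```

Running a topic model over these pseudo-documents yields a topic distribution per word rather than per document, which is what allows word-level semantic comparison across the two languages.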
Keywords:
Natural Language Processing: Information Extraction
Uncertainty in AI: Graphical Models