An Adaptive Hierarchical Compositional Model for Phrase Embedding
Bing Li, Xiaochun Yang, Bin Wang, Wei Wang, Wei Cui, Xianchao Zhang
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4144-4151.
https://doi.org/10.24963/ijcai.2018/576
Phrase embedding aims at representing phrases in a vector space and is important for the performance of many NLP tasks. Existing models regard a phrase as either fully compositional or non-compositional, ignoring the hybrid compositionality that widely exists, especially in long phrases. This drawback prevents them from gaining deeper insight into the semantic structure of long phrases and, as a consequence, weakens the accuracy of the embeddings. In this paper, we present a novel method for jointly learning compositionality and phrase embedding by adaptively weighting different compositions using an implicit hierarchical structure. Our model can adaptively adjust among different compositions without incurring excessive model complexity or time cost. To the best of our knowledge, our work is the first effort that considers hybrid compositionality in phrase embedding. The experimental evaluation demonstrates that our model outperforms state-of-the-art methods on both similarity and analogy tasks.
Keywords:
Natural Language Processing: Natural Language Processing
Natural Language Processing: Embeddings
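
To make the idea of adaptively weighting different compositions concrete, below is a minimal sketch, not the paper's architecture: it mixes a compositional view of a phrase (here simply the average of its word vectors) with a non-compositional view (a dedicated phrase vector) via a learned sigmoid gate. The gate form, the mean composition, and all names and shapes are illustrative assumptions; the paper instead drives the weighting with an implicit hierarchical structure learned jointly with the embeddings.

```python
# Illustrative sketch only: a gated mixture of a compositional embedding
# (average of word vectors) and a non-compositional embedding (a dedicated
# phrase vector). The gate, composition function, and shapes are assumptions
# for exposition, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)
dim = 50

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def phrase_embedding(word_vecs, phrase_vec, gate_w, gate_b):
    """Adaptively weight a compositional and a non-compositional view.

    word_vecs : (n_words, dim) word embeddings of the phrase's tokens
    phrase_vec: (dim,)         embedding treating the phrase as one unit
    gate_w    : (2 * dim,)     gate parameters (learned in practice)
    gate_b    : float          gate bias
    """
    composed = word_vecs.mean(axis=0)                # compositional view
    features = np.concatenate([composed, phrase_vec])
    alpha = sigmoid(features @ gate_w + gate_b)      # compositionality weight
    return alpha * composed + (1.0 - alpha) * phrase_vec

# Toy usage with random vectors standing in for trained embeddings.
words = rng.normal(size=(3, dim))      # e.g. tokens of "kick the bucket"
phrase = rng.normal(size=dim)          # vector for the phrase as a whole
w, b = rng.normal(size=2 * dim), 0.0
vec = phrase_embedding(words, phrase, w, b)
print(vec.shape)  # (50,)
```

A gate value near 1 treats the phrase as fully compositional, a value near 0 as non-compositional, and intermediate values capture the hybrid compositionality the abstract highlights.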