Contrastive Representation Learning for Self-Supervised Taxonomy Completion

Yuhang Niu, Hongyuan Xu, Ciyi Liu, Yanlong Wen, Xiaojie Yuan

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 6442-6450. https://doi.org/10.24963/ijcai.2024/712

Taxonomy completion, a self-supervised task, aims to add new concepts to an existing taxonomy by attaching them to appropriate hypernym and hyponym pairs. Researchers have proposed several approaches to capture the essential relationships in a taxonomy using semantic or structural information. However, these approaches either construct training signals from a single view or simply use a random sampling strategy, which is insufficient to capture the various relations in the taxonomic structure and to learn high-quality representations. To address this, we propose CoSTC, a contrastive learning framework that captures diverse relations and improves representations for taxonomy completion. It uses two contrasting views, namely an intra-view and an inter-view, to provide rich self-supervised signals. In intra-view contrasting, we exploit the correlations within queries and within positions by performing an instance-level discrimination task. In inter-view contrasting, we use a sampling strategy that considers diversity and hardness to select representative pairs, enhancing the learning of fine-grained query-position relations. Experimental results on three datasets verify the effectiveness of our approach. Our code is available at https://github.com/nyh-a/CoSTC.
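The sketch below illustrates, in a minimal and hypothetical form, how the two contrastive objectives described in the abstract could be instantiated with an InfoNCE-style loss; it is not the authors' implementation (see the linked repository for that), and the function name, embedding sizes, and temperature are assumptions made for illustration.

```python
# Illustrative sketch only; the actual CoSTC implementation is in the linked
# repository. Names such as `info_nce` and the temperature value are assumptions.
import torch
import torch.nn.functional as F

def info_nce(anchor, positive, temperature=0.1):
    """InfoNCE loss: each row of `anchor` is pulled toward the matching row
    of `positive` and pushed away from all other rows (in-batch negatives)."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature      # (B, B) similarity matrix
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)

# Intra-view contrasting: two encodings of the same queries (or of the same
# candidate positions) form positive pairs for instance-level discrimination.
q_view1, q_view2 = torch.randn(32, 128), torch.randn(32, 128)
intra_loss = info_nce(q_view1, q_view2)

# Inter-view contrasting: a query is paired with the representation of its
# gold <hypernym, hyponym> position; other in-batch positions act as negatives.
query_emb, position_emb = torch.randn(32, 128), torch.randn(32, 128)
inter_loss = info_nce(query_emb, position_emb)

loss = intra_loss + inter_loss
```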
Keywords:
Natural Language Processing: NLP: Information extraction