Robust Contrastive Multi-view Kernel Clustering

Peng Su, Yixi Liu, Shujian Li, Shudong Huang, Jiancheng Lv

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 4938-4945. https://doi.org/10.24963/ijcai.2024/546

Multi-view kernel clustering (MKC) aims to fully exploit the consistency and complementarity of multiple views in a latent Hilbert space, thereby enhancing clustering performance. The results of most MKC methods are highly sensitive to the quality of the constructed kernels, since traditional methods compute a kernel matrix for each view independently, without fully exploiting complementary information across views. Previous contrastive multi-view kernel learning addresses this by pulling cross-view instances of the same sample closer during kernel construction while pushing instances of different samples apart, so as to integrate cross-view information comprehensively. Its inherent drawback, however, is that during training it may inappropriately amplify the distances between instances that belong to the same cluster (i.e., false negative pairs), reducing inter-class discriminability. To address this challenge, we propose a Robust Contrastive multi-view Kernel learning approach (R-CMK) that is resistant to false negative pairs. It partitions negative pairs into intervals based on distance or similarity and, for false negative pairs, reverses their optimization gradient. This prevents the distances of false negative pairs from being amplified further while still pushing true negative pairs farther apart. We conducted comprehensive experiments on various MKC methods to validate the effectiveness of the proposed approach. The code is available at https://github.com/Duo-laimi/rcmk_main.
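The core idea of interval-based gradient reversal can be illustrated with a minimal sketch. This is not the authors' exact R-CMK loss; it is a simplified NumPy illustration under assumed names: negative pairs whose cross-view similarity exceeds a hypothetical threshold `tau` are treated as likely false negatives, and the sign of their loss term is flipped so that minimizing the loss pulls them together instead of pushing them farther apart.

```python
import numpy as np

def robust_negative_loss(sim, pos_mask, tau=0.8):
    """Illustrative sketch of gradient reversal for suspected false negatives.

    sim      : (n, n) cross-view similarity matrix
    pos_mask : (n, n) boolean mask, True for positive (same-sample) pairs
    tau      : hypothetical similarity threshold above which a negative
               pair is treated as a likely false negative

    Ordinary negative pairs contribute +sim (minimizing the loss pushes
    them apart); suspected false negatives contribute -sim (the reversed
    sign means minimizing the loss pulls them together instead).
    """
    neg_mask = ~pos_mask
    # +1 keeps the usual repulsive direction; -1 reverses the gradient
    sign = np.where(sim > tau, -1.0, 1.0)
    per_pair = sign * sim * neg_mask
    return per_pair.sum() / max(neg_mask.sum(), 1)
```

For example, with two samples whose cross-view similarity matrix is `[[1.0, 0.9], [0.2, 1.0]]` and diagonal positives, the (0, 1) negative pair (similarity 0.9 > tau) has its sign reversed, while the (1, 0) pair (similarity 0.2) is pushed apart as usual.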
Keywords:
Machine Learning: ML: Multi-view learning
Machine Learning: ML: Clustering
Machine Learning: ML: Kernel methods