Optimal Graph Learning and Nuclear Norm Maximization for Deep Cross-Domain Robust Label Propagation

Wei Wang, Hanyang Li, Ke Shi, Chao Huang, Yang Cao, Cong Wang, Xiaochun Cao

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 1407-1415. https://doi.org/10.24963/ijcai.2024/156

Domain adaptation aims to transfer labels from a labeled source domain to an unlabeled target domain when the two domains exhibit different distributions. Existing methods primarily concentrate on designing a feature extractor that learns better domain-invariant features and on developing an effective classifier for reliable predictions. In this paper, we introduce optimal graph learning to generate a cross-domain graph that effectively connects the two domains, together with two domain-specific graphs that capture domain-specific structures. On the one hand, we incorporate the three graphs into the label propagation (LP) classifier to enhance its robustness to distribution differences. On the other hand, we leverage the three graphs to introduce graph embedding losses, promoting the learning of locally discriminative and domain-invariant features. Furthermore, we maximize the nuclear norm of the predictions in LP to enhance class diversity, thereby improving its robustness to the class imbalance problem. Correspondingly, we develop an efficient algorithm to solve the associated optimization problem. Finally, we integrate the proposed LP and graph embedding losses into a deep neural network, yielding our proposed deep cross-domain robust LP. Extensive experiments on three cross-domain benchmark datasets demonstrate that the proposed approach outperforms existing state-of-the-art domain adaptation methods.
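
To make the two core ideas in the abstract concrete, the sketch below illustrates (i) graph-based label propagation over features pooled from both domains and (ii) a nuclear-norm term on the target prediction matrix whose maximization encourages class diversity. This is a minimal illustrative sketch, not the authors' implementation: the k-NN affinity graph, the helper names (build_knn_graph, propagate_labels, nuclear_norm_reward), and all hyperparameters are assumptions; the paper's optimal graph learning and its efficient solver are not reproduced here.

```python
# Minimal sketch of label propagation plus nuclear-norm maximization.
# Assumptions for illustration only: a simple k-NN cosine-similarity graph
# stands in for the learned optimal cross-domain graph.
import torch
import torch.nn.functional as F


def build_knn_graph(feats: torch.Tensor, k: int = 10) -> torch.Tensor:
    """Symmetric, row-normalized k-NN affinity matrix from L2-normalized features."""
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t()                        # cosine similarities
    topk = sim.topk(k + 1, dim=1).indices[:, 1:]   # drop self-similarity
    W = torch.zeros_like(sim).scatter_(1, topk, 1.0)
    W = (W + W.t()) / 2                            # symmetrize
    d = W.sum(1).clamp_min(1e-8)
    return W / d.unsqueeze(1)                      # row-normalize


def propagate_labels(W: torch.Tensor, Y: torch.Tensor,
                     alpha: float = 0.99, iters: int = 20) -> torch.Tensor:
    """Classic label propagation iteration: F <- alpha * W @ F + (1 - alpha) * Y."""
    Fmat = Y.clone()
    for _ in range(iters):
        Fmat = alpha * (W @ Fmat) + (1 - alpha) * Y
    return Fmat


def nuclear_norm_reward(P: torch.Tensor) -> torch.Tensor:
    """Nuclear norm of the softmax prediction matrix; maximizing it promotes
    confident and class-diverse predictions."""
    return torch.linalg.matrix_norm(P, ord='nuc')


if __name__ == "__main__":
    # Toy usage: labeled source and unlabeled target features from some extractor.
    n_s, n_t, d, C = 32, 32, 64, 5
    src_feat, tgt_feat = torch.randn(n_s, d), torch.randn(n_t, d)
    src_lab = torch.randint(0, C, (n_s,))

    feats = torch.cat([src_feat, tgt_feat], 0)
    Y = torch.zeros(n_s + n_t, C)
    Y[:n_s] = F.one_hot(src_lab, C).float()        # target rows stay zero

    W = build_knn_graph(feats)
    P = propagate_labels(W, Y)
    # Maximize the nuclear norm of target predictions by minimizing its negative.
    loss = -nuclear_norm_reward(F.softmax(P[n_s:], dim=1))
    print(loss.item())
```

In the paper, the affinity matrix is instead learned (a cross-domain graph plus two domain-specific graphs), and the propagation and nuclear-norm objectives are solved jointly inside a deep network; the sketch only conveys the mechanics of the two building blocks.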
Keywords:
Computer Vision: CV: Transfer, low-shot, semi- and un-supervised learning
Machine Learning: ML: Classification
Machine Learning: ML: Feature extraction, selection and dimensionality reduction