Alleviating Imbalanced Pseudo-label Distribution: Self-Supervised Multi-Source Domain Adaptation with Label-specific Confidence
Shuai Lü, Meng Kang, Ximing Li
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 4669-4677.
https://doi.org/10.24963/ijcai.2024/516
Existing self-supervised Multi-Source Domain Adaptation (MSDA) methods often suffer from an imbalanced distribution of pseudo-labels. This imbalance leaves many labels with too many or too few pseudo-labeled samples on the target domain, referred to as easy-to-learn labels and hard-to-learn labels, respectively. Both kinds of labels hurt generalization performance on the target domain. To alleviate this problem, in this paper we propose a novel multi-source domain adaptation method, namely Self-Supervised multi-Source Domain Adaptation with Label-specific Confidence (S3DA-LC). Specifically, we estimate label-specific confidences, i.e., the learning difficulties of labels, and adopt them to generate pseudo-labels for target samples, enabling us to simultaneously constrain and enrich the pseudo supervised signals for easy-to-learn and hard-to-learn labels. We evaluate S3DA-LC on several benchmark datasets, demonstrating its superior performance compared with existing MSDA baselines.
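To illustrate the general idea of label-specific (per-class) confidence for pseudo-labeling, the following is a minimal sketch, not the authors' exact formulation. It assumes a hypothetical helper `pseudo_label_with_class_confidence` that estimates each class's difficulty from the mean confidence of its predicted target samples and then selects pseudo-labels with a class-dependent threshold, so that easy-to-learn labels are filtered more strictly and hard-to-learn labels less strictly.

```python
import torch
import torch.nn.functional as F

def pseudo_label_with_class_confidence(logits, base_threshold=0.9):
    """Hypothetical sketch: assign pseudo-labels with per-class thresholds.

    `logits` are model predictions on unlabeled target samples, shape (N, C).
    Classes with many confident predictions (easy-to-learn) keep a high
    threshold; classes with few confident predictions (hard-to-learn) get
    a lower one, enriching their pseudo supervised signal.
    """
    probs = F.softmax(logits, dim=1)          # (N, C) class probabilities
    max_probs, preds = probs.max(dim=1)       # confidence and predicted class

    # Estimate per-class learning difficulty as the mean confidence of
    # samples currently predicted as that class (an assumed proxy).
    num_classes = probs.size(1)
    class_conf = torch.zeros(num_classes)
    for c in range(num_classes):
        mask = preds == c
        class_conf[c] = max_probs[mask].mean() if mask.any() else 0.0

    # Scale the base threshold by each class's relative confidence:
    # easy classes stay strict, hard classes are relaxed.
    thresholds = base_threshold * (class_conf / class_conf.max().clamp(min=1e-8))

    keep = max_probs >= thresholds[preds]     # label-specific selection
    return preds[keep], keep
```

In use, the selected target samples and their pseudo-labels would be mixed into the supervised objective alongside the labeled source domains; the specific confidence estimator and selection rule in S3DA-LC may differ from this sketch.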
Keywords:
Machine Learning: ML: Multi-task and transfer learning
Machine Learning: ML: Classification
Machine Learning: ML: Self-supervised Learning