RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation
Qucheng Peng, Zhengming Ding, Lingjuan Lyu, Lichao Sun, Chen Chen
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 4118-4126.
https://doi.org/10.24963/ijcai.2023/458
Source-Free domain adaptation transfers a source-trained model to the target domain without exposing the source data, aiming to dispel concerns about data privacy and security. However, this paradigm is still at risk of data leakage due to adversarial attacks on the source model. The Black-Box setting therefore allows access only to the outputs of the source model, but it suffers even more severely from overfitting to the source domain because the source model's weights are unseen. In this paper, we propose a novel approach named RAIN (RegulArization on Input and Network) for Black-Box domain adaptation, which applies regularization at both the input level and the network level. At the input level, we design a new data augmentation technique called Phase MixUp, which highlights task-relevant objects in the interpolations, thus enhancing input-level regularization and class consistency for target models. At the network level, we develop a Subnetwork Distillation mechanism that transfers knowledge from a target subnetwork to the full target network via knowledge distillation, alleviating overfitting to the source domain by learning diverse target representations. Extensive experiments show that our method achieves state-of-the-art performance on several cross-domain benchmarks under both single- and multi-source black-box domain adaptation.
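The abstract only names the two regularizers, so the sketches below are illustrative readings rather than the paper's exact formulation. The first assumes Phase MixUp interpolates an image with its phase-only Fourier reconstruction (the phase spectrum is commonly taken to carry object structure, i.e., task-relevant content); the function names `phase_only_reconstruction` and `phase_mixup`, and the Beta-sampled mixing coefficient, are assumptions made for illustration.

```python
import torch

def phase_only_reconstruction(x: torch.Tensor) -> torch.Tensor:
    """Reconstruct images from their Fourier phase spectrum only.

    x: real tensor of shape (B, C, H, W).
    Keeping unit amplitude retains structural (phase) information,
    which is what "highlights task-relevant objects" alludes to.
    """
    phase = torch.angle(torch.fft.fft2(x))
    return torch.fft.ifft2(torch.exp(1j * phase)).real

def phase_mixup(x: torch.Tensor, alpha: float = 1.0) -> torch.Tensor:
    """Illustrative Phase MixUp: convexly combine each image with its
    phase-only reconstruction, using a MixUp-style Beta coefficient."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    return lam * x + (1.0 - lam) * phase_only_reconstruction(x)
```

For Subnetwork Distillation, the sketch below uses the standard temperature-scaled KL distillation loss, with the subnetwork's logits acting as the teacher signal for the full target network as the abstract describes; how the subnetwork itself is constructed (e.g., by dropping channels or layers) is not specified in the abstract and is left out here.

```python
import torch.nn.functional as F

def subnetwork_distillation_loss(full_logits, sub_logits, temperature: float = 2.0):
    """KL-based distillation from subnetwork outputs (teacher) to the
    full target network (student), following the standard
    knowledge-distillation recipe with temperature scaling."""
    t = temperature
    teacher = F.softmax(sub_logits.detach() / t, dim=1)
    student = F.log_softmax(full_logits / t, dim=1)
    return F.kl_div(student, teacher, reduction="batchmean") * (t * t)
```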
Keywords:
Machine Learning: ML: Multi-task and transfer learning
Computer Vision: CV: Transfer, low-shot, semi- and un- supervised learning