More is Better: Deep Domain Adaptation with Multiple Sources

Sicheng Zhao, Hui Chen, Hu Huang, Pengfei Xu, Guiguang Ding

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Survey Track. Pages 8354-8362. https://doi.org/10.24963/ijcai.2024/923

In many practical applications, it is often difficult and expensive to obtain the large-scale labeled data needed to train state-of-the-art deep neural networks. Therefore, transferring knowledge learned from a separate, labeled source domain to an unlabeled or sparsely labeled target domain becomes an appealing alternative. However, direct transfer often results in significant performance degradation due to domain shift. Domain adaptation (DA) aims to address this problem by aligning the distributions of the source and target domains. Multi-source domain adaptation (MDA) is a powerful and practical extension in which the labeled data may be collected from multiple sources with different distributions. In this survey, we first define the various MDA strategies. We then systematically summarize and compare modern MDA methods in the deep learning era from different perspectives, describe commonly used datasets, and present a brief benchmark. Finally, we discuss future research directions for MDA that are worth investigating.
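To make the MDA setting concrete, the following is a minimal sketch, assuming PyTorch and a linear-kernel MMD alignment loss; all names here (mmd_linear, mda_step, the toy layer sizes) are illustrative rather than the survey's notation or any particular method it covers. A shared feature extractor and classifier are trained on several labeled source domains, while an alignment term pulls each source's features toward those of the unlabeled target.

```python
# Illustrative MDA sketch (assumed PyTorch; not from the survey itself):
# several labeled source domains, one unlabeled target domain.
import torch
import torch.nn as nn

def mmd_linear(x, y):
    """Linear-kernel maximum mean discrepancy: squared distance
    between the empirical feature means of two batches."""
    return (x.mean(dim=0) - y.mean(dim=0)).pow(2).sum()

# Toy shared feature extractor and classifier (sizes are arbitrary).
feature_extractor = nn.Sequential(nn.Linear(256, 128), nn.ReLU())
classifier = nn.Linear(128, 10)
params = list(feature_extractor.parameters()) + list(classifier.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
ce = nn.CrossEntropyLoss()

def mda_step(source_batches, target_x, align_weight=0.1):
    """One training step over a list of (x, y) batches, one per labeled
    source domain, plus an unlabeled target batch."""
    target_feat = feature_extractor(target_x)
    loss = torch.tensor(0.0)
    for src_x, src_y in source_batches:
        src_feat = feature_extractor(src_x)
        # Supervised classification loss on this source domain.
        loss = loss + ce(classifier(src_feat), src_y)
        # Align this source's feature distribution with the target's.
        loss = loss + align_weight * mmd_linear(src_feat, target_feat)
    loss = loss / len(source_batches)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with two synthetic source domains (shifted distributions)
# and one unlabeled target batch.
sources = [(torch.randn(32, 256) + i, torch.randint(0, 10, (32,)))
           for i in range(2)]
target = torch.randn(32, 256) + 0.5
print(mda_step(sources, target))
```

Averaging the per-source losses is only one of the combination strategies the survey compares; weighting sources by their similarity to the target is a common alternative.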
Keywords:
Machine Learning: ML: Multi-task and transfer learning
Computer Vision: CV: Transfer, low-shot, semi- and un- supervised learning