Rethinking Correlation Learning via Label Prior for Open Set Domain Adaptation

Zi-Xian Huang, Chuan-Xian Ren

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 884-892. https://doi.org/10.24963/ijcai.2024/98

Open Set Domain Adaptation (OSDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain, where known classes exist across domains while unknown classes are present only in the target domain. Existing methods rely on the clustering structure to identify the unknown classes, which empirically induces a large identification error if the unknown classes are a mixture of multiple components. To break through this barrier, we formulate OSDA from the view of correlation and propose a correlation metric-based framework called Balanced Correlation Learning (BCL). BCL employs the Hilbert-Schmidt Independence Criterion (HSIC) to characterize the separation between unknown and known classes, where HSIC is reformulated as the nodes' relation on a graph. By considering the label prior as a variable, theoretical results are derived to analytically show a sufficient condition for the desired learning direction in OSDA. Methodologically, a class-balanced HSIC is proposed to preserve domain-invariant and class-discriminative features. With the guarantee of correlation learning, an entropy-based principle can effectively identify the unknown classes via uncertainty. Extensive empirical evaluations show that BCL achieves significant performance improvements.
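To make the HSIC measure concrete, the sketch below implements the standard biased empirical HSIC estimator, trace(KHLH)/(n-1)^2, with RBF kernels; this illustrates the generic criterion only, not the paper's class-balanced or graph-based reformulation, and the kernel bandwidth and toy data are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC: trace(K H L H) / (n - 1)^2,
    # where H = I - 11^T / n is the centering matrix.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check: a dependent pair should score higher than an independent one.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Y = X[:, :1] + 0.1 * rng.normal(size=(100, 1))  # Y is a function of X plus noise
Z = rng.normal(size=(100, 1))                   # Z is independent of X

K = rbf_kernel(X)
print(hsic(K, rbf_kernel(Y)) > hsic(K, rbf_kernel(Z)))  # True
```

A larger HSIC value indicates stronger statistical dependence between the two sets of samples, which is the property BCL exploits to keep known-class features correlated with labels while decorrelating unknown-class features.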
Keywords:
Computer Vision: CV: Machine learning for vision
Computer Vision: CV: Transfer, low-shot, semi- and un- supervised learning