Navigating Continual Test-time Adaptation with Symbiosis Knowledge

Xu Yang, Moqi Li, Jie Yin, Kun Wei, Cheng Deng

Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 5326-5334. https://doi.org/10.24963/ijcai.2024/589

Continual test-time domain adaptation seeks to adapt a source pre-trained model to a continually changing target domain without incurring additional data acquisition or labeling costs. Unfortunately, existing mainstream methods can fall into a detrimental cycle: domain shift produces noisy pseudo-labels that immediately degrade the model's knowledge, and the long-term accumulation of these errors makes it harder for the model to generalize to future domain shifts and contributes to catastrophic forgetting. To address these challenges, this paper introduces a Dual-stream Network that independently optimizes different parameters in each stream to capture symbiotic knowledge from continual domains, ensuring generalization while enhancing instantaneous discrimination. Furthermore, to prevent catastrophic forgetting, a weighted soft parameter alignment method is designed to leverage knowledge from the source model. Finally, reliable supervision signals are calibrated and explored to mitigate instantaneous negative optimization, through label calibration with prior knowledge, label selection using self-adaptive confidence thresholds, and a soft-weighted contrastive module for capturing potential semantics. Extensive experimental results demonstrate that our method achieves state-of-the-art performance on several benchmark datasets.
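
To make two of the mechanisms named above concrete, the sketch below illustrates, in generic PyTorch, (1) a soft parameter-alignment penalty that pulls the adapting model back toward the frozen source model and (2) pseudo-label selection with a confidence threshold. This is a minimal illustration under assumed conventions, not the authors' released code: the uniform `weight`, the fixed `threshold`, and the helper names (`soft_parameter_alignment`, `adaptation_step`) are placeholders, and the dual-stream architecture, label calibration, and soft-weighted contrastive module are not reproduced here.

import copy
import torch
import torch.nn.functional as F


def select_confident_pseudo_labels(logits, threshold):
    """Keep only predictions whose max softmax probability exceeds `threshold`."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo = probs.max(dim=1)
    mask = conf > threshold
    return pseudo, mask


def soft_parameter_alignment(model, source_model, weight=1e-3):
    """L2 penalty pulling adapted parameters toward the frozen source parameters.

    A uniform `weight` is used here for illustration; the paper's method weights
    parameters non-uniformly (hence "weighted" soft alignment).
    """
    penalty = 0.0
    for p, p_src in zip(model.parameters(), source_model.parameters()):
        penalty = penalty + ((p - p_src.detach()) ** 2).sum()
    return weight * penalty


def adaptation_step(model, source_model, x, optimizer, threshold=0.9):
    """One test-time adaptation step on an unlabeled target batch `x`."""
    logits = model(x)
    with torch.no_grad():
        pseudo, mask = select_confident_pseudo_labels(logits, threshold)
    if mask.any():
        loss = F.cross_entropy(logits[mask], pseudo[mask])
    else:
        loss = logits.sum() * 0.0  # no reliable pseudo-labels in this batch
    loss = loss + soft_parameter_alignment(model, source_model)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Usage (sketch): keep a frozen copy of the source model, e.g.
#   source_model = copy.deepcopy(pretrained_model).eval()
# then call adaptation_step(...) on each incoming target batch.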
Keywords:
Machine Learning: ML: Unsupervised learning
Computer Vision: CV: Transfer, low-shot, semi- and un- supervised learning