Contrastive Learning Is Not Optimal for Quasiperiodic Time Series
Adrian Atienza, Jakob Bardram, Sadasivan Puthusserypady
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 3661-3668.
https://doi.org/10.24963/ijcai.2024/405
Despite recent advancements in Self-Supervised Learning (SSL) for Time Series analysis, a noticeable gap persists between anticipated and actual performance. While these methods have demonstrated strong generalization with minimal labels in various domains, they remain notably less effective at distinguishing between classes when only a limited number of annotated records is available. We hypothesize that this bottleneck stems from the prevalent use of Contrastive Learning, a training objective shared by previous state-of-the-art (SOTA) methods.
By mandating that representations of negative pairs drawn from separate records be distinct, this approach compels the model to encode unique record-level patterns, but it simultaneously neglects changes that occur across the record. To overcome this limitation, we introduce Distilled Embedding for Almost-Periodic Time Series (DEAPS), a non-contrastive method tailored for quasiperiodic time series such as electrocardiogram (ECG) data. By avoiding negative pairs, we not only mitigate the model's blindness to temporal changes but also enable the integration of a "Gradual Loss (L_gra)" function, which guides the model to capture dynamic patterns that evolve throughout the record. The outcomes are promising: DEAPS achieves a notable improvement of +10% over existing SOTA methods when only a few annotated records are available to fit a Machine Learning (ML) model on the learned representations.
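To make the contrast concrete, the sketch below compares a standard InfoNCE-style contrastive objective, in which other records in the batch serve as negatives, with a non-contrastive alignment term plus an illustrative smoothness penalty over consecutive windows of the same record. This is a minimal PyTorch sketch for intuition only, assuming generic encoder outputs; the gradual_loss term is a hypothetical stand-in and does not reproduce the paper's actual Gradual Loss (L_gra) or the DEAPS architecture.

import torch
import torch.nn.functional as F

def contrastive_loss(z_a, z_b, temperature=0.1):
    # InfoNCE-style objective: two views of the same record are positives,
    # all other records in the batch act as negatives.
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    logits = z_a @ z_b.t() / temperature      # (B, B) similarity matrix
    targets = torch.arange(z_a.size(0))       # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

def alignment_loss(z_a, z_b):
    # Non-contrastive alternative: pull positive pairs together without
    # pushing other records apart (no negative pairs involved).
    return F.mse_loss(F.normalize(z_a, dim=-1), F.normalize(z_b, dim=-1))

def gradual_loss(z_seq):
    # Hypothetical stand-in for a gradual term: encourage embeddings of
    # consecutive windows from the same record to evolve smoothly, so that
    # within-record dynamics are not collapsed into a single record signature.
    # z_seq: (B, T, D) embeddings of T consecutive windows per record.
    diffs = z_seq[:, 1:] - z_seq[:, :-1]
    return diffs.pow(2).mean()

# Toy usage with random tensors standing in for an encoder's output.
B, T, D = 8, 16, 64
z_a, z_b = torch.randn(B, D), torch.randn(B, D)
z_seq = torch.randn(B, T, D)
print(contrastive_loss(z_a, z_b).item(),
      alignment_loss(z_a, z_b).item(),
      gradual_loss(z_seq).item())

The key design point illustrated here is that once negative pairs are removed, nothing forces all windows of a record onto a single point, so an additional within-record term (such as the smoothness penalty above) can shape how representations change over the course of the record.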
Keywords:
Machine Learning: ML: Self-supervised Learning
Machine Learning: ML: Time series and data streams
Multidisciplinary Topics and Applications: MTA: Health and medicine