Self-supervised learning for time series analysis: Taxonomy, progress, and prospects

K Zhang, Q Wen, C Zhang, R Cai, M Jin… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Self-supervised learning (SSL) has recently achieved impressive performance on various
time series tasks. The most prominent advantage of SSL is that it reduces the dependence …

Deep learning for time series classification and extrinsic regression: A current survey

N Mohammadi Foumani, L Miller, CW Tan… - ACM Computing …, 2024 - dl.acm.org
Time Series Classification and Extrinsic Regression are important and challenging machine
learning tasks. Deep learning has revolutionized natural language processing and computer …

Self-supervised contrastive pre-training for time series via time-frequency consistency

X Zhang, Z Zhao, T Tsiligkaridis… - Advances in Neural …, 2022 - proceedings.neurips.cc
Pre-training on time series poses a unique challenge due to the potential mismatch between
pre-training and target domains, such as shifts in temporal dynamics, fast-evolving trends …
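
The title's central claim is that a series and its frequency-domain view should map to consistent embeddings. As a minimal sketch of that idea (the names and the NT-Xent-style loss choice are illustrative assumptions, not the authors' code):

```python
import torch
import torch.nn.functional as F

def time_frequency_consistency_loss(z_time, z_freq, temperature=0.2):
    """NT-Xent-style loss pulling together the time- and frequency-domain
    embeddings of the same series. All names here are illustrative,
    not the TF-C authors' API."""
    z_t = F.normalize(z_time, dim=1)    # (batch, dim)
    z_f = F.normalize(z_freq, dim=1)    # (batch, dim)
    logits = z_t @ z_f.T / temperature  # pairwise cosine similarities
    labels = torch.arange(z_t.size(0), device=z_t.device)
    # each series is positive with its own frequency view, negative with the rest
    return F.cross_entropy(logits, labels)

# Usage sketch: z_time = encoder_t(x); z_freq = encoder_f(torch.fft.rfft(x).abs())
```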

SimMTM: A simple pre-training framework for masked time-series modeling

J Dong, H Wu, H Zhang, L Zhang… - Advances in Neural …, 2023 - proceedings.neurips.cc
Time series analysis is widely used across many areas. Recently, to reduce labeling
expenses and benefit various tasks, self-supervised pre-training has attracted immense …
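
The masked-modeling recipe named in the title generally amounts to hiding time steps and reconstructing them. A bare-bones sketch of that generic recipe (not SimMTM's specific series-wise masking and neighbor-aggregation scheme):

```python
import torch
import torch.nn as nn

def masked_reconstruction_step(model: nn.Module, x: torch.Tensor, mask_ratio=0.5):
    """Generic masked-modeling pre-training step for time series.
    `model` maps (batch, length, channels) -> same shape; this is an
    illustrative recipe, not SimMTM's exact formulation."""
    # randomly mask a fraction of time steps across all channels
    mask = (torch.rand(x.shape[:2], device=x.device) < mask_ratio).unsqueeze(-1)
    x_masked = x.masked_fill(mask, 0.0)  # zero out masked time steps
    x_hat = model(x_masked)
    # reconstruction loss is computed only on the masked positions
    return ((x_hat - x) ** 2)[mask.expand_as(x)].mean()
```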

TEST: Text prototype aligned embedding to activate LLM's ability for time series

C Sun, H Li, Y Li, S Hong - arXiv preprint arXiv:2308.08241, 2023 - arxiv.org
This work summarizes two ways to accomplish Time-Series (TS) tasks in today's Large
Language Model (LLM) context: LLM-for-TS (model-centric) designs and trains a …

Contrast everything: A hierarchical contrastive framework for medical time-series

Y Wang, Y Han, H Wang… - Advances in Neural …, 2023 - proceedings.neurips.cc
Contrastive representation learning is crucial in medical time series analysis as it alleviates
dependency on labor-intensive, domain-specific, and scarce expert annotations. However …
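
The entry's premise, contrasting "everything", suggests combining contrastive losses computed at several data levels. A hedged sketch of that aggregation (the level structure and weighting are assumptions, not the paper's design):

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Standard InfoNCE between two views; row i of z1 pairs with row i of z2."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature
    return F.cross_entropy(logits, torch.arange(z1.size(0), device=z1.device))

def hierarchical_contrastive_loss(views_by_level, weights=(1.0, 1.0, 1.0)):
    """Sum per-level contrastive losses, e.g. over observation-, sample-, and
    patient-level view pairs; level names and weights are illustrative."""
    return sum(w * info_nce(z1, z2)
               for w, (z1, z2) in zip(weights, views_by_level))
```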

A cookbook of self-supervised learning

R Balestriero, M Ibrahim, V Sobal, A Morcos… - arXiv preprint arXiv …, 2023 - arxiv.org
Self-supervised learning, dubbed the dark matter of intelligence, is a promising path to
advance machine learning. Yet, much like cooking, training SSL methods is a delicate art …

Exploring simple triplet representation learning

Z Ren, Q Lan, Y Zhang, S Wang - Computational and Structural …, 2024 - Elsevier
Fully supervised learning methods necessitate a substantial volume of labelled training
instances, a process that is typically both labour-intensive and costly. In the realm of medical …
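
Triplet representation learning, as named in the title, trains an encoder so an anchor embedding sits closer to a positive view than to a negative one. A minimal sketch using PyTorch's built-in nn.TripletMarginLoss (the toy encoder and sampling scheme are assumptions, not the paper's setup):

```python
import torch
import torch.nn as nn

# Illustrative triplet setup: anchor and positive are two augmented views of
# the same recording; negative comes from a different recording.
embed = nn.Sequential(nn.Flatten(), nn.Linear(256, 128))  # toy encoder
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)

anchor   = embed(torch.randn(32, 1, 256))  # view 1 of each sample
positive = embed(torch.randn(32, 1, 256))  # view 2 of the same samples
negative = embed(torch.randn(32, 1, 256))  # views of different samples
loss = triplet_loss(anchor, positive, negative)
loss.backward()
```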

Toward a foundation model for time series data

CCM Yeh, X Dai, H Chen, Y Zheng, Y Fan… - Proceedings of the …, 2023 - dl.acm.org
A foundation model is a machine learning model trained on a large and diverse set of data,
typically using self-supervised learning-based pre-training techniques, that can be adapted …

Unsupervised representation learning for time series: A review

Q Meng, H Qian, Y Liu, Y Xu, Z Shen, L Cui - arXiv preprint arXiv …, 2023 - arxiv.org
Unsupervised representation learning approaches aim to learn discriminative feature
representations from unlabeled data, without the requirement of annotating every sample …