Self-supervised learning for time series analysis: Taxonomy, progress, and prospects
Self-supervised learning (SSL) has recently achieved impressive performance on various
time series tasks. The most prominent advantage of SSL is that it reduces the dependence …
Deep learning for time series classification and extrinsic regression: A current survey
Time Series Classification and Extrinsic Regression are important and challenging machine
learning tasks. Deep learning has revolutionized natural language processing and computer …
Self-supervised contrastive pre-training for time series via time-frequency consistency
Pre-training on time series poses a unique challenge due to the potential mismatch between
pre-training and target domains, such as shifts in temporal dynamics, fast-evolving trends …
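The core idea named in this title, time-frequency consistency, can be illustrated as a contrastive objective that pulls together the time-domain and frequency-domain embeddings of the same series while pushing apart embeddings of different series. The sketch below is a minimal illustration under that assumption, not the paper's actual TF-C architecture; `tf_consistency_loss`, the linear encoder stand-ins, and all shapes are hypothetical.

```python
import torch
import torch.nn.functional as F

def tf_consistency_loss(time_emb, freq_emb, temperature=0.1):
    """InfoNCE-style loss: the time- and frequency-domain embeddings of the
    same series are positives; all other pairs in the batch are negatives."""
    time_emb = F.normalize(time_emb, dim=1)
    freq_emb = F.normalize(freq_emb, dim=1)
    logits = time_emb @ freq_emb.T / temperature          # (B, B) similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)

# Toy usage: linear stand-ins for real encoders over a batch of series.
B, T, D = 8, 128, 64
x = torch.randn(B, T)
time_encoder = torch.nn.Linear(T, D)
freq_encoder = torch.nn.Linear(T // 2 + 1, D)
x_freq = torch.fft.rfft(x, dim=1).abs()                  # magnitude spectrum view
loss = tf_consistency_loss(time_encoder(x), freq_encoder(x_freq))
```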
SimMTM: A simple pre-training framework for masked time-series modeling
Time series analysis is widely used across many domains. Recently, to reduce labeling
expenses and benefit various tasks, self-supervised pre-training has attracted immense …
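As a rough illustration of the masked-modeling pretext task this title refers to, the sketch below hides random time steps and scores reconstruction error on the hidden positions only. It is a generic baseline, not SimMTM's specific scheme (which reconstructs a series from multiple masked variants); `masked_reconstruction_loss` and the linear stand-in network are hypothetical.

```python
import torch

def masked_reconstruction_loss(model, x, mask_ratio=0.5):
    """Generic masked-modeling pretext: zero out random time steps and score
    reconstruction error on the masked positions only."""
    mask = torch.rand_like(x) < mask_ratio     # True where values are hidden
    x_hat = model(x.masked_fill(mask, 0.0))    # model sees the corrupted series
    return ((x_hat - x) ** 2)[mask].mean()

# Toy usage with a linear stand-in for the reconstruction network.
B, T = 8, 128
loss = masked_reconstruction_loss(torch.nn.Linear(T, T), torch.randn(B, T))
```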
TEST: Text prototype aligned embedding to activate LLM's ability for time series
This work summarizes two ways to accomplish Time-Series (TS) tasks in today's Large
Language Model (LLM) context: LLM-for-TS (model-centric) designs and trains a …
Contrast everything: A hierarchical contrastive framework for medical time-series
Contrastive representation learning is crucial in medical time series analysis as it alleviates
dependency on labor-intensive, domain-specific, and scarce expert annotations. However …
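A hierarchical contrastive objective can be sketched by applying an InfoNCE-style loss to two augmented views of per-timestep features at several temporal resolutions. This is a generic simplification along the time axis only, an assumption rather than the paper's actual multi-level design for medical data; `info_nce` and `hierarchical_contrast` are hypothetical names.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """Cross-entropy over cosine similarities; positives on the diagonal."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature
    return F.cross_entropy(logits, torch.arange(z1.size(0), device=z1.device))

def hierarchical_contrast(h1, h2, levels=3, temperature=0.1):
    """Contrast two views of per-timestep features (B, T, D), then repeat at
    coarser temporal resolutions obtained by pooling along the time axis."""
    loss = 0.0
    for _ in range(levels):
        d = h1.size(-1)
        # Treat every (series, timestep) pair as an instance: (B*T, D).
        loss = loss + info_nce(h1.reshape(-1, d), h2.reshape(-1, d), temperature)
        # Halve the temporal resolution: (B, T, D) -> (B, T//2, D).
        h1 = F.max_pool1d(h1.transpose(1, 2), 2).transpose(1, 2)
        h2 = F.max_pool1d(h2.transpose(1, 2), 2).transpose(1, 2)
    return loss / levels

# Toy usage: two augmented views of the same batch of feature sequences.
h = torch.randn(4, 32, 16)
loss = hierarchical_contrast(h + 0.05 * torch.randn_like(h), h)
```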
A cookbook of self-supervised learning
Self-supervised learning, dubbed the dark matter of intelligence, is a promising path to
advance machine learning. Yet, much like cooking, training SSL methods is a delicate art …
Exploring simple triplet representation learning
Fully supervised learning methods necessitate a substantial volume of labelled training
instances, a process that is typically both labour-intensive and costly. In the realm of medical …
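Triplet representation learning reduces to a margin loss over (anchor, positive, negative) embedding triples. Below is a minimal sketch assuming positives come from a jittered view of the anchor series and negatives from other series in the batch; the linear encoder and the jitter augmentation are illustrative placeholders, not the paper's setup.

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Margin loss: the anchor must sit closer to the positive than to the
    negative by at least `margin` in embedding space."""
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return F.relu(d_pos - d_neg + margin).mean()

# Toy usage: positives are jittered views of the anchor series; negatives are
# other series from the same batch (rolled so no series pairs with itself).
encoder = torch.nn.Linear(128, 64)                 # stand-in for a real encoder
x = torch.randn(8, 128)
anchor = encoder(x)
positive = encoder(x + 0.05 * torch.randn_like(x))
negative = encoder(torch.roll(x, shifts=1, dims=0))
loss = triplet_loss(anchor, positive, negative)
```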
Toward a foundation model for time series data
A foundation model is a machine learning model trained on a large and diverse set of data,
typically using self-supervised learning-based pre-training techniques, that can be adapted …
Unsupervised representation learning for time series: A review
Unsupervised representation learning approaches aim to learn discriminative feature
representations from unlabeled data, without the requirement of annotating every sample …