State of the art: a review of sentiment analysis based on sequential transfer learning
Recently, sequential transfer learning has emerged as a modern technique for applying the
“pretrain then fine-tune” paradigm to leverage existing knowledge to improve the …
A survey on aspect-based sentiment analysis: Tasks, methods, and challenges
As an important fine-grained sentiment analysis problem, aspect-based sentiment analysis
(ABSA), aiming to analyze and understand people's opinions at the aspect level, has been …
Exploring aspect-based sentiment quadruple extraction with implicit aspects, opinions, and ChatGPT: a comprehensive survey
In contrast to earlier ABSA studies primarily concentrating on individual sentiment
components, recent research has ventured into more complex ABSA tasks encompassing …
Self-supervised contrastive learning for code retrieval and summarization via semantic-preserving transformations
We propose Corder, a self-supervised contrastive learning framework for source code
models. Corder is designed to alleviate the need for labeled data for code retrieval and code …
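The core idea behind Corder, generating positive pairs from semantic-preserving transformations of the same program, can be illustrated with a minimal sketch. The transformation below (systematic identifier renaming) is one common semantic-preserving choice; the function name and example snippet are hypothetical, not taken from the paper.

```python
import re

def rename_identifiers(code, mapping):
    # Semantic-preserving transformation: renaming variables does not change
    # program behavior, so (code, transformed code) forms a positive pair
    # for contrastive pretraining; unrelated snippets serve as negatives.
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, mapping)) + r")\b")
    return pattern.sub(lambda m: mapping[m.group(1)], code)

original = "def add(a, b):\n    return a + b"
positive_view = rename_identifiers(original, {"a": "x", "b": "y"})
# positive_view is "def add(x, y):\n    return x + y" - same semantics, new surface form
```

A real pipeline would apply several such transformations (dead-code insertion, statement reordering) over parsed ASTs rather than regex substitution, but the positive-pair construction is the same.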
Heterogeneous contrastive learning for foundation models and beyond
In the era of big data and Artificial Intelligence, an emerging paradigm is to utilize contrastive
self-supervised learning to model large-scale heterogeneous data. Many existing foundation …
Making pre-trained language models end-to-end few-shot learners with contrastive prompt tuning
Pre-trained Language Models (PLMs) have achieved remarkable performance for various
language understanding tasks in IR systems, which require the fine-tuning process based …
Confidence-aware sentiment quantification via sentiment perturbation modeling
Sentiment Quantification aims to detect the overall sentiment polarity of users from a set of
reviews corresponding to a target. Existing methods equally treat and aggregate individual …
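The contrast the abstract draws, between aggregating review-level predictions with equal weight and weighting them by model confidence, can be sketched roughly as follows. Both function names and the weighting scheme are hypothetical simplifications, not the paper's actual method.

```python
def quantify_equal(polarities):
    # Baseline: every review-level polarity prediction counts equally.
    return sum(polarities) / len(polarities)

def quantify_confidence_weighted(polarities, confidences):
    # Confidence-aware variant: predictions the model is unsure about
    # contribute less to the target-level sentiment estimate.
    total = sum(confidences)
    return sum(p * c for p, c in zip(polarities, confidences)) / total

# A confident positive review outweighs an uncertain negative one:
# quantify_equal([1.0, -1.0]) -> 0.0
# quantify_confidence_weighted([1.0, -1.0], [0.9, 0.1]) -> 0.8
```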
DC-SiamNet: Deep contrastive Siamese network for self-supervised MRI reconstruction
Y Yan, T Yang, X Zhao, C Jiao, A Yang… - Computers in Biology and …, 2023 - Elsevier
Reconstruction methods based on deep learning have greatly shortened the data
acquisition time of magnetic resonance imaging (MRI). However, these methods typically …
Multi-task self-supervised time-series representation learning
Time-series representation learning is crucial for extracting meaningful representations from
time-series data with temporal dynamics and sparse labels. Contrastive learning, a powerful …
Supervised contrastive learning with hard negative samples
Through minimization of an appropriate loss function such as the InfoNCE loss, contrastive
learning (CL) learns a useful representation function by pulling positive samples close to …
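The InfoNCE objective mentioned above can be written out concretely: for an anchor, it maximizes similarity to the positive sample relative to a set of negatives. A minimal dependency-free sketch (cosine similarity, single positive):

```python
import math

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    # InfoNCE: -log( exp(sim(a,p)/t) / (exp(sim(a,p)/t) + sum_n exp(sim(a,n)/t)) )
    # Minimizing it pulls the positive close to the anchor and pushes negatives away.
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
        return dot / norm

    pos = math.exp(cos(anchor, positive) / temperature)
    negs = sum(math.exp(cos(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + negs))

# The loss is small when the positive aligns with the anchor and a negative does not,
# and large in the reverse configuration:
aligned = info_nce_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
misaligned = info_nce_loss([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0]])
# aligned < misaligned
```

Hard-negative variants (as in the paper above) change how the `negatives` set is sampled, concentrating it on samples near the anchor, while the loss form stays the same.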