State of the art: a review of sentiment analysis based on sequential transfer learning

JYL Chan, KT Bea, SMH Leow, SW Phoong… - Artificial Intelligence …, 2023 - Springer
Recently, sequential transfer learning has emerged as a modern technique for applying the
"pretrain then fine-tune" paradigm to leverage existing knowledge to improve the …
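
To make the paradigm concrete, below is a minimal sketch of "pretrain then fine-tune" for sentiment classification. It assumes the Hugging Face Transformers API and the bert-base-uncased checkpoint; the toy batch and hyperparameters are placeholders, and nothing here is drawn from the survey itself.

# Minimal sketch of "pretrain then fine-tune" for sentiment classification.
# Assumes the Hugging Face Transformers API; the checkpoint, toy batch, and
# hyperparameters are illustrative placeholders.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)          # 2 labels: negative / positive

texts = ["The battery life is great.", "The screen cracked within a week."]
labels = torch.tensor([1, 0])                   # toy downstream sentiment labels

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)         # cross-entropy over sentiment labels
outputs.loss.backward()                         # gradients flow into pretrained weights
optimizer.step()                                # one fine-tuning step

The same pretrained encoder can then be reused across downstream sentiment tasks, which is the knowledge transfer the survey refers to.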

A survey on aspect-based sentiment analysis: Tasks, methods, and challenges

W Zhang, X Li, Y Deng, L Bing… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
As an important fine-grained sentiment analysis problem, aspect-based sentiment analysis
(ABSA), aiming to analyze and understand people's opinions at the aspect level, has been …

Exploring aspect-based sentiment quadruple extraction with implicit aspects, opinions, and ChatGPT: a comprehensive survey

H Zhang, YN Cheah, OM Alyasiri, J An - Artificial Intelligence Review, 2024 - Springer
In contrast to earlier ABSA studies primarily concentrating on individual sentiment
components, recent research has ventured into more complex ABSA tasks encompassing …

Self-supervised contrastive learning for code retrieval and summarization via semantic-preserving transformations

NDQ Bui, Y Yu, L Jiang - Proceedings of the 44th International ACM …, 2021 - dl.acm.org
We propose Corder, a self-supervised contrastive learning framework for source code
models. Corder is designed to alleviate the need for labeled data for code retrieval and code …
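
As a rough illustration of the idea in the title, the sketch below treats two semantic-preserving views of the same snippet (here, a trivial variable renaming) as a positive pair and scores them with an InfoNCE-style objective. The renaming function and the character-count "encoder" are placeholders invented for this sketch; they are not Corder's actual transformations or model.

# Rough sketch of contrastive pre-training on source code: two
# semantic-preserving views of the same snippet form a positive pair.
# The renaming "transformation" and the bag-of-characters encoder are
# placeholders for illustration only.
import torch
import torch.nn.functional as F

def rename_variables(code: str) -> str:
    # Toy semantic-preserving transformation: rename one identifier.
    return code.replace("total", "acc")

def encode(code: str) -> torch.Tensor:
    # Placeholder encoder: character-frequency vector; a real system would
    # use a neural code encoder (e.g., a Transformer over tokens or an AST).
    v = torch.zeros(128)
    for ch in code:
        v[min(ord(ch), 127)] += 1.0
    return v

snippets = [
    "total = 0\nfor x in xs:\n    total += x",
    "def find(xs, t):\n    return [i for i, x in enumerate(xs) if x == t]",
]

views_a = torch.stack([encode(s) for s in snippets])
views_b = torch.stack([encode(rename_variables(s)) for s in snippets])

a = F.normalize(views_a, dim=1)
b = F.normalize(views_b, dim=1)
logits = a @ b.t() / 0.07                  # cosine similarities / temperature
targets = torch.arange(len(snippets))      # i-th view in A matches i-th view in B
loss = F.cross_entropy(logits, targets)    # InfoNCE-style objective, no labels needed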

Heterogeneous contrastive learning for foundation models and beyond

L Zheng, B Jing, Z Li, H Tong, J He - Proceedings of the 30th ACM …, 2024 - dl.acm.org
In the era of big data and Artificial Intelligence, an emerging paradigm is to utilize contrastive
self-supervised learning to model large-scale heterogeneous data. Many existing foundation …

Making pre-trained language models end-to-end few-shot learners with contrastive prompt tuning

Z Xu, C Wang, M Qiu, F Luo, R Xu, S Huang… - Proceedings of the …, 2023 - dl.acm.org
Pre-trained Language Models (PLMs) have achieved remarkable performance for various
language understanding tasks in IR systems, which require the fine-tuning process based …

Confidence-aware sentiment quantification via sentiment perturbation modeling

X Tang, D Liao, M Shen, L Zhu, S Huang… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Sentiment Quantification aims to detect the overall sentiment polarity of users from a set of
reviews corresponding to a target. Existing methods equally treat and aggregate individual …
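
The contrast the abstract points at can be illustrated with a toy aggregation step: a plain average treats every review equally, while a weighted average lets a per-review confidence score modulate the target-level estimate. The numbers and the weighting rule below are invented for illustration and are not the paper's perturbation-based model.

# Toy sentiment quantification: aggregate per-review predictions into a
# target-level polarity estimate. The plain average is the "equal treatment"
# baseline; the confidence weighting is only a generic illustration.
reviews = [
    {"p_positive": 0.92, "confidence": 0.9},   # hypothetical classifier outputs
    {"p_positive": 0.40, "confidence": 0.3},
    {"p_positive": 0.75, "confidence": 0.8},
]

# Equal-weight aggregation (every review counts the same).
uniform = sum(r["p_positive"] for r in reviews) / len(reviews)

# Confidence-weighted aggregation: low-confidence reviews count less.
total_w = sum(r["confidence"] for r in reviews)
weighted = sum(r["p_positive"] * r["confidence"] for r in reviews) / total_w

print(f"uniform estimate:  {uniform:.3f}")
print(f"weighted estimate: {weighted:.3f}")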

DC-SiamNet: Deep contrastive Siamese network for self-supervised MRI reconstruction

Y Yan, T Yang, X Zhao, C Jiao, A Yang… - Computers in Biology and …, 2023 - Elsevier
Reconstruction methods based on deep learning have greatly shortened the data
acquisition time of magnetic resonance imaging (MRI). However, these methods typically …

Multi-task self-supervised time-series representation learning

H Choi, P Kang - Information Sciences, 2024 - Elsevier
Time-series representation learning is crucial for extracting meaningful representations from
time-series data with temporal dynamics and sparse labels. Contrastive learning, a powerful …

Supervised contrastive learning with hard negative samples

R Jiang, T Nguyen, P Ishwar… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
Through minimization of an appropriate loss function such as the InfoNCE loss, contrastive
learning (CL) learns a useful representation function by pulling positive samples close to …
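
For readers unfamiliar with the objective, the sketch below shows a generic supervised-contrastive step in which positives share the anchor's label and only the most similar different-label embeddings are kept as hard negatives. It illustrates the general recipe, not the specific loss or sampling scheme proposed in the paper.

# Generic supervised contrastive step with hard negatives: positives share
# the anchor's class label; for each anchor only the most similar
# different-class embeddings are retained as negatives.
import torch
import torch.nn.functional as F

def supcon_hard_negatives(emb, labels, temperature=0.1, num_hard=2):
    z = F.normalize(emb, dim=1)
    sim = z @ z.t() / temperature                   # pairwise scaled similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    neg_mask = labels.unsqueeze(0) != labels.unsqueeze(1)

    losses = []
    for i in range(n):
        pos = sim[i][pos_mask[i]]
        neg = sim[i][neg_mask[i]]
        if pos.numel() == 0 or neg.numel() == 0:
            continue
        # Hardest negatives = most similar different-class samples.
        hard_neg = neg.topk(min(num_hard, neg.numel())).values
        # InfoNCE-style term: each positive against the anchor's hard negatives.
        denom = torch.cat([pos.unsqueeze(1), hard_neg.expand(pos.numel(), -1)], dim=1)
        losses.append((-pos + torch.logsumexp(denom, dim=1)).mean())
    return torch.stack(losses).mean()

emb = torch.randn(8, 16)                            # placeholder embeddings
labels = torch.tensor([0, 0, 1, 1, 2, 2, 0, 1])     # placeholder class labels
loss = supcon_hard_negatives(emb, labels)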