Deep learning in sentiment analysis: Recent architectures

T Abdullah, A Ahmet - ACM Computing Surveys, 2022 - dl.acm.org
Humans are increasingly integrated with devices that enable the collection of vast
unstructured opinionated data. Accurately analysing subjective information from this data is …

UDALM: Unsupervised domain adaptation through language modeling

C Karouzos, G Paraskevopoulos… - arXiv preprint arXiv …, 2021 - arxiv.org
In this work we explore Unsupervised Domain Adaptation (UDA) of pretrained language
models for downstream tasks. We introduce UDALM, a fine-tuning procedure, using a mixed …
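
The mixed objective named in this snippet can be illustrated with a short sketch: a shared encoder is trained jointly on a classification loss over labeled source-domain batches and a masked-language-modeling loss over unlabeled target-domain batches. This is an illustrative reconstruction, not the authors' released code; the model name, masking rate, and mixing weight `lam` are placeholder choices.

```python
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

class MixedFineTuner(nn.Module):
    """Shared encoder with a sentiment head and an MLM head."""
    def __init__(self, name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(name)
        h = self.encoder.config.hidden_size
        self.cls_head = nn.Linear(h, num_labels)
        self.mlm_head = nn.Linear(h, self.encoder.config.vocab_size)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state  # (batch, seq_len, hidden)

def mask_tokens(input_ids, tokenizer, p=0.15):
    """Randomly mask tokens for the MLM objective; unmasked positions get
    label -100 so the loss ignores them. (A full implementation would also
    protect special tokens such as [CLS]/[SEP].)"""
    labels = input_ids.clone()
    mask = (torch.rand(input_ids.shape) < p) & (input_ids != tokenizer.pad_token_id)
    labels[~mask] = -100
    masked = input_ids.clone()
    masked[mask] = tokenizer.mask_token_id
    return masked, labels

def mixed_step(model, tokenizer, src_batch, src_labels, tgt_batch, lam=0.5):
    """One step: lam * classification loss (labeled source batch)
    + (1 - lam) * MLM loss (unlabeled target-domain batch)."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)
    src_h = model(src_batch["input_ids"], src_batch["attention_mask"])
    cls_loss = ce(model.cls_head(src_h[:, 0]), src_labels)  # [CLS] token
    mlm_ids, mlm_labels = mask_tokens(tgt_batch["input_ids"], tokenizer)
    tgt_h = model(mlm_ids, tgt_batch["attention_mask"])
    mlm_logits = model.mlm_head(tgt_h)
    mlm_loss = ce(mlm_logits.view(-1, mlm_logits.size(-1)), mlm_labels.view(-1))
    return lam * cls_loss + (1.0 - lam) * mlm_loss

# tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
```

At test time only the classification head is used; the MLM term exists to adapt the encoder to target-domain text during fine-tuning.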

Cross-domain data augmentation with domain-adaptive language modeling for aspect-based sentiment analysis

J Yu, Q Zhao, R Xia - Proceedings of the 61st Annual Meeting of …, 2023 - aclanthology.org
Cross-domain Aspect-Based Sentiment Analysis (ABSA) aims to leverage the
useful knowledge from a source domain to identify aspect-sentiment pairs in sentences from …

On the domain adaptation and generalization of pretrained language models: A survey

X Guo, H Yu - arXiv preprint arXiv:2211.03154, 2022 - arxiv.org
Recent advances in NLP have been driven by a range of large-scale pretrained language models
(PLMs), which have brought significant performance gains across many NLP tasks …

[PDF][PDF] Cross-domain review generation for aspect-based sentiment analysis

J Yu, C Gong, R Xia - Findings of the Association for …, 2021 - aclanthology.org
Supervised learning methods have proven to be effective for Aspect-Based Sentiment
Analysis (ABSA). However, the lack of fine-grained labeled data hinders their effectiveness in …

Improving self-training for cross-lingual named entity recognition with contrastive and prototype learning

R Zhou, X Li, L Bing, E Cambria, C Miao - arXiv preprint arXiv:2305.13628, 2023 - arxiv.org
In cross-lingual named entity recognition (NER), self-training is commonly used to bridge the
linguistic gap by training on pseudo-labeled target-language data. However, due to sub …
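
The self-training step named in this snippet, training on pseudo-labeled target-language data, hinges on filtering noisy teacher predictions. Below is a minimal sketch of confidence-based filtering, assuming softmax token probabilities from a teacher NER model; the threshold `tau` and the sentence-level filter are illustrative, and the paper's contrastive and prototype components are not reproduced here.

```python
import numpy as np

def pseudo_label(token_probs, tau=0.9):
    """token_probs: (num_tokens, num_labels) softmax outputs of a teacher
    model on one unlabeled target-language sentence. Returns per-token
    pseudo-labels, with low-confidence tokens set to -100 (ignored in loss)."""
    conf = token_probs.max(axis=1)
    labels = token_probs.argmax(axis=1)
    labels[conf < tau] = -100
    return labels

def select_sentences(batch_probs, tau=0.9, min_kept=0.8):
    """Keep sentences where at least `min_kept` of tokens clear the threshold;
    discard the rest as too noisy to train on."""
    kept = []
    for probs in batch_probs:
        labels = pseudo_label(probs, tau)
        if (labels != -100).mean() >= min_kept:
            kept.append(labels)
        else:
            kept.append(None)  # sentence dropped from the student's training set
    return kept
```

The student model is then trained on the source-language gold labels plus the retained pseudo-labels; the paper's contrastive and prototype objectives further reduce the effect of pseudo-label noise.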

BERTologiCoMix: How does code-mixing interact with multilingual BERT?

S Santy, A Srinivasan, M Choudhury - Proceedings of the Second …, 2021 - aclanthology.org
Models such as mBERT and XLMR have shown success in solving Code-Mixed
NLP tasks even though they were not exposed to such text during pretraining. Code-Mixed …

Efficient dynamic feature adaptation for cross language sentiment analysis with biased adversarial training

R Li, C Liu, D Jiang - Knowledge-Based Systems, 2023 - Elsevier
Fine-tuning a large multi-lingual pretrained language model demonstrates impressive
results in cross-language understanding. However, it still suffers when the training and test …
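
The adversarial ingredient referenced in the title is commonly realized with a gradient-reversal layer: a discriminator learns to tell source-language from target-language features, while reversed gradients push the encoder toward language-invariant representations. A minimal DANN-style sketch follows; the paper's biased weighting and dynamic feature selection are not reproduced.

```python
import torch
from torch import nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; multiplies gradients by -lam backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

class LanguageDiscriminator(nn.Module):
    """Predicts whether a feature vector came from the source or target language."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim // 2), nn.ReLU(),
                                 nn.Linear(dim // 2, 2))

    def forward(self, feats, lam=1.0):
        return self.net(grad_reverse(feats, lam))

# Training step sketch: task loss on labeled source data plus an adversarial
# loss; the reversed gradients train the encoder to fool the discriminator.
#   d_src = disc(feats_src); d_tgt = disc(feats_tgt)
#   adv_loss = ce(d_src, zeros) + ce(d_tgt, ones)
```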

VIBE: Topic-driven temporal adaptation for Twitter classification

Y Zhang, J Li, W Li - arXiv preprint arXiv:2310.10191, 2023 - arxiv.org
Language features evolve continually on real-world social media, degrading the
performance of text classifiers over time. To address this challenge, we study temporal …

Domain confused contrastive learning for unsupervised domain adaptation

Q Long, T Luo, W Wang, SJ Pan - arXiv preprint arXiv:2207.04564, 2022 - arxiv.org
In this work, we study Unsupervised Domain Adaptation (UDA) under a challenging self-
supervised setting. One of the difficulties is how to learn task discrimination in the …
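
Contrastive UDA objectives of this kind typically build on an InfoNCE loss that pulls two views of the same instance together while pushing apart the other instances in the batch. A generic sketch, assuming `z_i` and `z_j` are encoder features of an instance and its (e.g., domain-confused) view; the paper's specific view construction is not reproduced here.

```python
import torch
import torch.nn.functional as F

def info_nce(z_i, z_j, temperature=0.1):
    """z_i, z_j: (batch, dim) features of two views of the same instances.
    Matching rows are positives; all other rows in the batch act as negatives."""
    z_i = F.normalize(z_i, dim=1)
    z_j = F.normalize(z_j, dim=1)
    logits = z_i @ z_j.t() / temperature  # (batch, batch) cosine similarities
    targets = torch.arange(z_i.size(0), device=z_i.device)
    return F.cross_entropy(logits, targets)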