Deep learning in sentiment analysis: Recent architectures

T Abdullah, A Ahmet - ACM Computing Surveys, 2022 - dl.acm.org
Humans are increasingly integrated with devices that enable the collection of vast
unstructured opinionated data. Accurately analysing subjective information from this data is …

UDALM: Unsupervised domain adaptation through language modeling

C Karouzos, G Paraskevopoulos… - arXiv preprint arXiv …, 2021 - arxiv.org
In this work we explore Unsupervised Domain Adaptation (UDA) of pretrained language
models for downstream tasks. We introduce UDALM, a fine-tuning procedure, using a mixed …

Cross-domain data augmentation with domain-adaptive language modeling for aspect-based sentiment analysis

J Yu, Q Zhao, R Xia - Proceedings of the 61st Annual Meeting of …, 2023 - aclanthology.org
Cross-domain Aspect-Based Sentiment Analysis (ABSA) aims to leverage the
useful knowledge from a source domain to identify aspect-sentiment pairs in sentences from …

[PDF][PDF] Cross-domain review generation for aspect-based sentiment analysis

J Yu, C Gong, R Xia - Findings of the Association for …, 2021 - aclanthology.org
Supervised learning methods have proven to be effective for Aspect-Based Sentiment
Analysis (ABSA). However, the lack of fine-grained labeled data hinders their effectiveness in …

Improving self-training for cross-lingual named entity recognition with contrastive and prototype learning

R Zhou, X Li, L Bing, E Cambria, C Miao - arXiv preprint arXiv:2305.13628, 2023 - arxiv.org
In cross-lingual named entity recognition (NER), self-training is commonly used to bridge the
linguistic gap by training on pseudo-labeled target-language data. However, due to sub …

On the domain adaptation and generalization of pretrained language models: A survey

X Guo, H Yu - arXiv preprint arXiv:2211.03154, 2022 - arxiv.org
Recent advances in NLP are brought by a range of large-scale pretrained language models
(PLMs). These PLMs have brought significant performance gains for a range of NLP tasks …

AdaSL: An unsupervised domain adaptation framework for Arabic multi-dialectal sequence labeling

A El Mekki, A El Mahdaouy, I Berrada… - Information Processing & …, 2022 - Elsevier
Dialectal Arabic (DA) refers to varieties of everyday spoken languages in the Arab world.
These dialects differ according to the country and region of the speaker, and their textual …

Adapt in contexts: Retrieval-augmented domain adaptation via in-context learning

Q Long, W Wang, SJ Pan - arXiv preprint arXiv:2311.11551, 2023 - arxiv.org
Large language models (LLMs) have showcased their capability with few-shot inference
known as in-context learning. However, in-domain demonstrations are not always readily …

BERTologiCoMix: How does code-mixing interact with multilingual BERT?

S Santy, A Srinivasan, M Choudhury - Proceedings of the Second …, 2021 - aclanthology.org
Models such as mBERT and XLMR have shown success in solving Code-Mixed
NLP tasks even though they were not exposed to such text during pretraining. Code-Mixed …

PDALN: Progressive domain adaptation over a pre-trained model for low-resource cross-domain named entity recognition

T Zhang, C Xia, PS Yu, Z Liu, S Zhao - EMNLP, 2021 - par.nsf.gov
Cross-domain Named Entity Recognition (NER) transfers the NER knowledge from
high-resource domains to the low-resource target domain. Due to limited labeled resources …