Deep learning in sentiment analysis: Recent architectures
T Abdullah, A Ahmet - ACM Computing Surveys, 2022 - dl.acm.org
Humans are increasingly integrated with devices that enable the collection of vast
unstructured opinionated data. Accurately analysing subjective information from this data is …
UDALM: Unsupervised domain adaptation through language modeling
In this work we explore Unsupervised Domain Adaptation (UDA) of pretrained language
models for downstream tasks. We introduce UDALM, a fine-tuning procedure, using a mixed …
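The snippet truncates before naming the mixed objective, but UDALM is known to combine a supervised classification loss on labelled source-domain data with a masked language modelling (MLM) loss on unlabelled target-domain text. Below is a minimal PyTorch/HuggingFace sketch of one such mixed fine-tuning step; the checkpoint, the mixing weight `lmbda`, and the helper `train_step` are illustrative assumptions, not the paper's code.

```python
# Sketch of a UDALM-style mixed fine-tuning step: classification loss on the
# labelled source domain plus MLM loss on the unlabelled target domain.
import torch
from transformers import (AutoModelForMaskedLM, AutoModelForSequenceClassification,
                          AutoTokenizer, DataCollatorForLanguageModeling)

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
clf = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
mlm.bert = clf.bert        # share one encoder between the two heads
mlm.tie_weights()          # re-tie the MLM decoder to the shared embeddings
collator = DataCollatorForLanguageModeling(tok, mlm_probability=0.15)
params = {id(p): p for m in (clf, mlm) for p in m.parameters()}.values()  # dedup shared weights
opt = torch.optim.AdamW(params, lr=2e-5)
lmbda = 0.5                # assumed weight balancing the two losses

def train_step(src_texts, src_labels, tgt_texts):
    # Supervised loss on labelled source-domain batches.
    src = tok(src_texts, padding=True, truncation=True, return_tensors="pt")
    clf_loss = clf(**src, labels=torch.tensor(src_labels)).loss
    # MLM loss on unlabelled target-domain batches (masking done by the collator).
    enc = tok(tgt_texts, truncation=True)
    batch = collator([{"input_ids": ids} for ids in enc["input_ids"]])
    mlm_loss = mlm(**batch).loss
    loss = lmbda * clf_loss + (1.0 - lmbda) * mlm_loss
    opt.zero_grad(); loss.backward(); opt.step()
    return float(loss)
```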
Cross-domain data augmentation with domain-adaptive language modeling for aspect-based sentiment analysis
Cross-domain Aspect-Based Sentiment Analysis (ABSA) aims to leverage the
useful knowledge from a source domain to identify aspect-sentiment pairs in sentences from …
On the domain adaptation and generalization of pretrained language models: A survey
Recent advances in NLP are brought by a range of large-scale pretrained language models
(PLMs). These PLMs have brought significant performance gains for a range of NLP tasks …
Cross-domain review generation for aspect-based sentiment analysis
Supervised learning methods have proven to be effective for Aspect-Based Sentiment
Analysis (ABSA). However, the lack of fine-grained labeled data hinders their effectiveness in …
Improving self-training for cross-lingual named entity recognition with contrastive and prototype learning
In cross-lingual named entity recognition (NER), self-training is commonly used to bridge the
linguistic gap by training on pseudo-labeled target-language data. However, due to sub …
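As a rough illustration of the self-training setup this entry builds on, the sketch below trains on labelled source-language data, pseudo-labels unlabelled target-language examples, and keeps only high-confidence predictions for retraining. It simplifies NER to example-level classification; `model` is any classifier exposing sklearn-style `fit`/`predict_proba`, and the paper's contrastive and prototype components are not reproduced.

```python
# Generic self-training loop with confidence-based filtering of pseudo-labels.
import numpy as np

def self_train(model, X_src, y_src, X_tgt, threshold=0.9, rounds=3):
    model.fit(X_src, y_src)                    # teacher trained on source language
    for _ in range(rounds):
        proba = model.predict_proba(X_tgt)     # confidence over target examples
        keep = proba.max(axis=1) >= threshold  # drop low-confidence (noisy) labels
        pseudo_y = model.classes_[proba.argmax(axis=1)][keep]
        X_mix = np.concatenate([X_src, X_tgt[keep]])
        y_mix = np.concatenate([y_src, pseudo_y])
        model.fit(X_mix, y_mix)                # student on source + pseudo-labelled target
    return model
```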
BERTologiCoMix: How does code-mixing interact with multilingual BERT?
Models such as mBERT and XLMR have shown success in solving Code-Mixed
NLP tasks even though they were not exposed to such text during pretraining. Code-Mixed …
Efficient dynamic feature adaptation for cross language sentiment analysis with biased adversarial training
Fine-tuning a large multi-lingual pretrained language model demonstrates impressive
results in cross-language understanding. However, it still suffers when the training and test …
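Adversarial approaches to cross-language transfer typically hinge on a gradient reversal layer that pushes a shared encoder to confuse a language discriminator. The PyTorch sketch below shows that standard mechanism (in the style of Ganin and Lempitsky's DANN), not this paper's biased variant; the layer sizes and toy batch are assumptions.

```python
# Gradient reversal: identity on the forward pass, negated (scaled) gradient on
# the backward pass, so the feature extractor learns to fool the discriminator.
import torch
from torch import nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lmbda):
        ctx.lmbda = lmbda
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lmbda * grad_out, None

features = nn.Linear(768, 256)           # stand-in shared feature extractor
lang_discriminator = nn.Linear(256, 2)   # source vs. target language
x = torch.randn(4, 768)
h = torch.relu(features(x))
logits = lang_discriminator(GradReverse.apply(h, 1.0))
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 0, 1, 1]))
loss.backward()                          # reversed gradients reach `features`
```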
VIBE: Topic-driven temporal adaptation for Twitter classification
Language features are evolving in real-world social media, causing the performance of text
classification to deteriorate over time. To address this challenge, we study temporal …
Domain confused contrastive learning for unsupervised domain adaptation
In this work, we study Unsupervised Domain Adaptation (UDA) in a challenging self-
supervised approach. One of the difficulties is how to learn task discrimination in the …
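Contrastive UDA methods of this kind generally build on an InfoNCE-style objective. The sketch below shows that generic loss, assuming `z_i` and `z_j` are embeddings of two views of the same examples; the paper's specific domain-confused construction of positives is not reproduced here.

```python
# Generic InfoNCE loss: matched rows of z_i and z_j are positives, all other
# pairs in the batch serve as negatives.
import torch
import torch.nn.functional as F

def info_nce(z_i, z_j, temperature=0.1):
    """z_i, z_j: (N, d) embeddings of two views of the same N examples."""
    z_i, z_j = F.normalize(z_i, dim=1), F.normalize(z_j, dim=1)
    logits = z_i @ z_j.t() / temperature   # (N, N) scaled cosine similarities
    targets = torch.arange(z_i.size(0))    # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

loss = info_nce(torch.randn(8, 128), torch.randn(8, 128))
```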