BERTweet: A pre-trained language model for English Tweets

DQ Nguyen, T Vu, AT Nguyen - arXiv preprint arXiv:2005.10200, 2020 - arxiv.org
We present BERTweet, the first public large-scale pre-trained language model for English
Tweets. Our BERTweet, having the same architecture as BERT-base (Devlin et al., 2019), is …

Automated concatenation of embeddings for structured prediction

X Wang, Y Jiang, N Bach, T Wang, Z Huang… - arXiv preprint arXiv …, 2020 - arxiv.org
Pretrained contextualized embeddings are powerful word representations for structured
prediction tasks. Recent work found that better word representations can be obtained by …

Accelerating BERT inference for sequence labeling via early-exit

X Li, Y Shao, T Sun, H Yan, X Qiu, X Huang - arXiv preprint arXiv …, 2021 - arxiv.org
Both performance and efficiency are crucial factors for sequence labeling tasks in many real-
world scenarios. Although the pre-trained models (PTMs) have significantly improved the …

Question Calibration and Multi-Hop Modeling for Temporal Question Answering

C Xue, D Liang, P Wang, J Zhang - … of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Many models that leverage knowledge graphs (KGs) have recently demonstrated
remarkable success in question answering (QA) tasks. In the real world, many facts …

On the hidden negative transfer in sequential transfer learning for domain adaptation from news to tweets

S Meftah, N Semmar, Y Tamaazousti… - … Second Workshop on …, 2021 - cea.hal.science
Transfer Learning has been shown to be a powerful tool for Natural Language Processing
(NLP) and has outperformed the standard supervised learning paradigm, as it takes benefit …

Comateformer: Combined Attention Transformer for Semantic Sentence Matching

B Li, D Liang, Z Zhang - arXiv preprint arXiv:2412.07220, 2024 - arxiv.org
Transformer-based models have made significant strides in semantic matching tasks by
capturing connections between phrase pairs. However, to assess the relevance of sentence …

Data annealing for informal language understanding tasks

J Gu, Z Yu - arXiv preprint arXiv:2004.13833, 2020 - arxiv.org
There is a huge performance gap between formal and informal language understanding
tasks. The recent pre-trained models that improved the performance of formal language …

Cross-Genre Retrieval for Information Integrity: A COVID-19 Case Study

C Zuo, C Wang, R Banerjee - … Conference on Advanced Data Mining and …, 2023 - Springer
Ubiquitous communication on social media has led to a rapid increase in the proliferation of
unreliable information. Its ill-effects have perhaps been seen most obviously during the …

Psychophysiology-aided Perceptually Fluent Speech Analysis of Children Who Stutter

Y **ao, H Sharma, V Tumanova, A Salekin - arxiv preprint arxiv …, 2022 - arxiv.org
This first-of-its-kind paper presents a novel approach named PASAD that detects changes in
perceptually fluent speech acoustics of young children. Particularly, analysis of perceptually …

[PDF] Crude Oil Forecasting via Events and Outlook Extraction from Commodity News

M Lee - 2022 - scholar.archive.org
Natural language-based financial forecasting is an active area of research, with the majority
of publications centering around company stock prediction. There is, however, a huge …