BERTweet: A pre-trained language model for English Tweets
We present BERTweet, the first public large-scale pre-trained language model for English
Tweets. Our BERTweet, having the same architecture as BERT-base (Devlin et al., 2019), is …
Automated concatenation of embeddings for structured prediction
Pretrained contextualized embeddings are powerful word representations for structured
prediction tasks. Recent work found that better word representations can be obtained by …
Accelerating BERT inference for sequence labeling via early-exit
Both performance and efficiency are crucial factors for sequence labeling tasks in many real-
world scenarios. Although the pre-trained models (PTMs) have significantly improved the …
Question Calibration and Multi-Hop Modeling for Temporal Question Answering
Many models that leverage knowledge graphs (KGs) have recently demonstrated
remarkable success in question answering (QA) tasks. In the real world, many facts …
On the hidden negative transfer in sequential transfer learning for domain adaptation from news to tweets
Transfer Learning has been shown to be a powerful tool for Natural Language Processing
(NLP) and has outperformed the standard supervised learning paradigm, as it takes benefit …
Comateformer: Combined Attention Transformer for Semantic Sentence Matching
Transformer-based models have made significant strides in semantic matching tasks by
capturing connections between phrase pairs. However, to assess the relevance of sentence …
Data annealing for informal language understanding tasks
There is a huge performance gap between formal and informal language understanding
tasks. The recent pre-trained models that improved the performance of formal language …
Cross-Genre Retrieval for Information Integrity: A COVID-19 Case Study
Ubiquitous communication on social media has led to a rapid increase in the proliferation of
unreliable information. Its ill-effects have perhaps been seen most obviously during the …
Psychophysiology-aided Perceptually Fluent Speech Analysis of Children Who Stutter
This first-of-its-kind paper presents a novel approach named PASAD that detects changes in
perceptually fluent speech acoustics of young children. Particularly, analysis of perceptually …
Crude Oil Forecasting via Events and Outlook Extraction from Commodity News
M Lee - 2022 - scholar.archive.org
Natural language-based financial forecasting is an active area of research, with the majority
of publications centering around company stock prediction. There is, however, a huge …