Deep transfer learning & beyond: Transformer language models in information systems research
R Gruetzemacher, D Paradice - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
AI is widely thought to be poised to transform business, yet current perceptions of the scope
of this transformation may be myopic. Recent progress in natural language processing …
A survey on machine reading comprehension systems
Machine Reading Comprehension (MRC) is a challenging task and hot topic in Natural
Language Processing. The goal of this field is to develop systems for answering the …
Revisiting pre-trained models for Chinese natural language processing
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous
improvements across various NLP tasks, and consecutive variants have been proposed to …
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Z Yang - arXiv preprint arXiv:1906.08237, 2019 - fq.pkwyx.com
With the capability of modeling bidirectional contexts, denoising autoencoding-based
pretraining like BERT achieves better performance than pretraining approaches based on …
Pre-training with whole word masking for Chinese BERT
Bidirectional Encoder Representations from Transformers (BERT) has shown marvelous
improvements across various NLP tasks, and its consecutive variants have been proposed …
Semantics-aware BERT for language understanding
The latest work on language representations carefully integrates contextualized features into
language model training, which enables a series of successes, especially in various machine …
Cosmos QA: Machine reading comprehension with contextual commonsense reasoning
Understanding narratives requires reading between the lines, which, in turn, requires
interpreting the likely causes and effects of events, even when they are not mentioned …
Retrospective reader for machine reading comprehension
Machine reading comprehension (MRC) is an AI challenge that requires machines
to determine the correct answers to questions based on a given passage. MRC systems …
SG-Net: Syntax-guided machine reading comprehension
For machine reading comprehension, the capacity of effectively modeling the linguistic
knowledge from the detail-riddled and lengthy passages and getting rid of the noise is …
Meta-learning approaches for learning-to-learn in deep learning: A survey
Y Tian, X Zhao, W Huang - Neurocomputing, 2022 - Elsevier
Compared to traditional machine learning, deep learning can learn deeper, more abstract data
representations and capture the properties of scattered data. It has gained considerable …