Deep transfer learning & beyond: Transformer language models in information systems research

R Gruetzemacher, D Paradice - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
AI is widely thought to be poised to transform business, yet current perceptions of the scope
of this transformation may be myopic. Recent progress in natural language processing …

A survey on machine reading comprehension systems

R Baradaran, R Ghiasi, H Amirkhani - Natural Language Engineering, 2022 - cambridge.org
Machine Reading Comprehension (MRC) is a challenging task and a hot topic in Natural
Language Processing. The goal of this field is to develop systems for answering the …
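
The extractive MRC setting the survey covers is easy to demonstrate end to end. A minimal sketch, assuming the Hugging Face transformers library and its default extractive QA checkpoint (both are assumptions of this illustration, not artifacts of the survey):

    # Extractive MRC: locate an answer span inside a given passage.
    from transformers import pipeline

    qa = pipeline("question-answering")  # downloads a default SQuAD-tuned model

    passage = ("Machine Reading Comprehension systems answer questions "
               "by locating an answer span inside a given passage.")
    result = qa(question="How do MRC systems answer questions?",
                context=passage)

    # The pipeline returns the predicted span and its confidence.
    print(result["answer"], result["score"])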

Revisiting pre-trained models for Chinese natural language processing

Y Cui, W Che, T Liu, B Qin, S Wang, G Hu - arXiv preprint arXiv …, 2020 - arxiv.org
Bidirectional Encoder Representations from Transformers (BERT) has shown remarkable
improvements across various NLP tasks, and successive variants have been proposed to …

XLNet: Generalized Autoregressive Pretraining for Language Understanding

Z Yang, Z Dai, Y Yang, J Carbonell, R Salakhutdinov, QV Le - arXiv preprint arXiv:1906.08237, 2019 - fq.pkwyx.com
With the capability of modeling bidirectional contexts, denoising autoencoding-based
pretraining like BERT achieves better performance than pretraining approaches based on …
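
The contrast the snippet draws is between BERT's denoising objective (reconstruct masked tokens) and XLNet's generalized autoregressive one, which the paper defines as the expected log-likelihood over all permutations of the factorization order (restated here in the paper's notation):

    \max_\theta \; \mathbb{E}_{\mathbf{z} \sim \mathcal{Z}_T} \Big[ \sum_{t=1}^{T} \log p_\theta(x_{z_t} \mid \mathbf{x}_{\mathbf{z}_{<t}}) \Big]

Here \mathcal{Z}_T is the set of all permutations of [1, ..., T], so every token is predicted from some bidirectional context without ever corrupting the input with [MASK] tokens.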

Pre-training with whole word masking for Chinese BERT

Y Cui, W Che, T Liu, B Qin… - IEEE/ACM Transactions on …, 2021 - ieeexplore.ieee.org
Bidirectional Encoder Representations from Transformers (BERT) has shown remarkable
improvements across various NLP tasks, and its successive variants have been proposed …
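
Whole word masking changes only the masking step of BERT pretraining: if any WordPiece of a word is selected, every piece of that word is masked together. A minimal sketch of the grouping logic (the toy token sequence is invented; the 15% rate follows BERT's convention):

    import random

    # Toy WordPiece sequence: "##" marks a continuation of the previous word.
    tokens = ["the", "phil", "##harmon", "##ic", "played", "beauti", "##fully"]

    # Group piece indices into words: a non-"##" piece starts a new word.
    words, current = [], []
    for i, tok in enumerate(tokens):
        if tok.startswith("##"):
            current.append(i)
        else:
            if current:
                words.append(current)
            current = [i]
    words.append(current)

    # Sample whole words (not individual pieces) at a 15% rate, masking at
    # least one word so this toy example always shows the effect.
    random.seed(0)
    for word in random.sample(words, max(1, round(0.15 * len(words)))):
        for i in word:
            tokens[i] = "[MASK]"

    print(tokens)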

Semantics-aware BERT for language understanding

Z Zhang, Y Wu, H Zhao, Z Li, S Zhang, X Zhou… - Proceedings of the …, 2020 - ojs.aaai.org
The latest work on language representations carefully integrates contextualized features into
language model training, which has enabled a series of successes, especially in various machine …

Cosmos QA: Machine reading comprehension with contextual commonsense reasoning

L Huang, RL Bras, C Bhagavatula, Y Choi - arXiv preprint arXiv …, 2019 - arxiv.org
Understanding narratives requires reading between the lines, which in turn, requires
interpreting the likely causes and effects of events, even when they are not mentioned …

Retrospective reader for machine reading comprehension

Z Zhang, J Yang, H Zhao - Proceedings of the AAAI conference on …, 2021 - ojs.aaai.org
Machine reading comprehension (MRC) is an AI challenge that requires machines
to determine the correct answers to questions based on a given passage. MRC systems …
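
The retrospective design decides whether a question is answerable before committing to a span. A minimal sketch of that verification step, using the SQuAD 2.0-style null-score threshold the paper's verifier builds on (names and scores here are illustrative, not the authors' code):

    def verify_answer(span_score: float, null_score: float,
                      threshold: float = 0.0) -> bool:
        """Answer only if the best span beats the 'no answer' hypothesis
        by more than a tuned threshold (SQuAD 2.0-style verification)."""
        return (span_score - null_score) > threshold

    # Illustrative logits: best answer span vs. the no-answer prediction.
    print(verify_answer(span_score=4.2, null_score=1.3))  # True: answer
    print(verify_answer(span_score=0.5, null_score=2.8))  # False: abstain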

SG-Net: Syntax-guided machine reading comprehension

Z Zhang, Y Wu, J Zhou, S Duan, H Zhao… - Proceedings of the AAAI …, 2020 - aaai.org
For machine reading comprehension, the capacity to effectively model linguistic
knowledge from detail-riddled, lengthy passages and to get rid of the noise is …
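
SG-Net's central mechanism is a self-attention layer restricted by a mask derived from a dependency parse, so each token attends only to syntactically relevant tokens. A sketch of building such a mask from parent pointers (the ancestor-plus-self rule mirrors the paper's "syntactic dependency of interest"; the toy parse is invented for illustration):

    import numpy as np

    # Toy dependency parse as parent pointers; -1 marks the root.
    # "the cat sat on the mat", with "sat" (index 2) as root.
    parents = [1, 2, -1, 2, 5, 3]

    def sdoi_mask(parents):
        """Let each token attend to itself and all of its ancestors."""
        n = len(parents)
        mask = np.zeros((n, n), dtype=np.float32)
        for i in range(n):
            j = i
            while j != -1:
                mask[i, j] = 1.0  # token i may attend to token j
                j = parents[j]
        return mask

    # In the guided layer, positions where mask == 0 would be set to -inf
    # before the softmax over attention scores.
    print(sdoi_mask(parents))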

Meta-learning approaches for learning-to-learn in deep learning: A survey

Y Tian, X Zhao, W Huang - Neurocomputing, 2022 - Elsevier
Compared to traditional machine learning, deep learning can learn deeper, more abstract data
representations and capture the properties of scattered data. It has gained considerable …
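
"Learning to learn" is most concrete in MAML's two nested loops: an inner gradient step adapts to each sampled task, and the outer step updates the shared initialization so that one-step adaptation works well. A toy numpy sketch (the 1-D quadratic task family and the step sizes are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)

    # Task family: minimize (theta - c)^2 for a task-specific target c.
    def loss_grad(theta, c):
        return 2.0 * (theta - c)

    theta = 0.0                      # meta-learned initialization
    inner_lr, outer_lr = 0.1, 0.05

    for step in range(500):
        meta_grad = 0.0
        tasks = rng.normal(loc=1.0, scale=0.1, size=4)  # sample targets
        for c in tasks:
            # Inner loop: one adaptation step from the initialization.
            adapted = theta - inner_lr * loss_grad(theta, c)
            # Outer gradient of the post-adaptation loss w.r.t. theta;
            # d(adapted)/d(theta) = 1 - 2 * inner_lr for this quadratic.
            meta_grad += loss_grad(adapted, c) * (1.0 - 2.0 * inner_lr)
        theta -= outer_lr * meta_grad / len(tasks)

    print(theta)  # converges near the task-family mean, about 1.0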