Neural natural language processing for unstructured data in electronic health records: a review

I Li, J Pan, J Goldwasser, N Verma, WP Wong… - Computer Science …, 2022 - Elsevier
Electronic health records (EHRs), digital collections of patient healthcare events and
observations, are ubiquitous in medicine and critical to healthcare delivery, operations, and …

Transformer models used for text-based question answering systems

K Nassiri, M Akhloufi - Applied Intelligence, 2023 - Springer
Question answering systems are widely used in natural language processing (NLP) because of their broad range of applications. They consist of answering …

Improving CLIP training with language rewrites

L Fan, D Krishnan, P Isola… - Advances in Neural …, 2023 - proceedings.neurips.cc
Contrastive Language-Image Pre-training (CLIP) stands as one of the most effective
and scalable methods for training transferable vision models using paired image and text …
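
The core of CLIP-style training is a symmetric contrastive objective over paired image and text embeddings. A minimal sketch of that loss follows; it assumes generic PyTorch encoders producing fixed-size embeddings, and the batch size and temperature value are illustrative choices, not the paper's settings.

import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
    # Normalize so that dot products are cosine similarities.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    # logits[i, j] = similarity between image i and text j, scaled by temperature.
    logits = image_emb @ text_emb.t() / temperature
    # Matching image/text pairs sit on the diagonal of the similarity matrix.
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_images = F.cross_entropy(logits, targets)      # image-to-text direction
    loss_texts = F.cross_entropy(logits.t(), targets)   # text-to-image direction
    return (loss_images + loss_texts) / 2

# Random embeddings stand in for the outputs of the image and text encoders.
image_emb = torch.randn(8, 512)
text_emb = torch.randn(8, 512)
print(clip_contrastive_loss(image_emb, text_emb).item())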

On the effectiveness of adapter-based tuning for pretrained language model adaptation

R He, L Liu, H Ye, Q Tan, B Ding, L Cheng… - arXiv preprint arXiv …, 2021 - arxiv.org
Adapter-based tuning has recently arisen as an alternative to fine-tuning. It works by adding
light-weight adapter modules to a pretrained language model (PrLM) and only updating the …
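
Adapter-based tuning inserts small bottleneck modules into a frozen pretrained model and trains only those. A minimal sketch of one such module is given below, assuming PyTorch; the hidden and bottleneck sizes and the residual placement are illustrative assumptions rather than the paper's exact configuration.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    # Bottleneck adapter: down-project, nonlinearity, up-project, residual add.
    def __init__(self, hidden_size=768, bottleneck_size=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states):
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# In adapter-based tuning the pretrained weights stay frozen; only the small
# adapter parameters (and typically layer norms) receive gradient updates.
adapter = Adapter()
x = torch.randn(2, 16, 768)   # (batch, sequence length, hidden size)
print(adapter(x).shape)       # torch.Size([2, 16, 768])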

Universal language model fine-tuning for text classification

J Howard, S Ruder - arXiv preprint arXiv:1801.06146, 2018 - arxiv.org
Inductive transfer learning has greatly impacted computer vision, but existing approaches in
NLP still require task-specific modifications and training from scratch. We propose Universal …
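
ULMFiT's fine-tuning recipe combines several techniques, including discriminative learning rates that give lower layers smaller updates than the task-specific head. The sketch below illustrates only that ingredient with a stand-in PyTorch model; the layer grouping and learning rates are assumptions for illustration, not the paper's values.

import torch
import torch.nn as nn

# Stand-in for a pretrained language model split into layer groups.
model = nn.Sequential(
    nn.Embedding(1000, 64),   # lowest layer group
    nn.Linear(64, 64),        # middle layer group
    nn.Linear(64, 2),         # task-specific classifier head
)

base_lr = 1e-3
param_groups = [
    {"params": model[0].parameters(), "lr": base_lr / 4},  # lower layers: smallest rate
    {"params": model[1].parameters(), "lr": base_lr / 2},
    {"params": model[2].parameters(), "lr": base_lr},      # head: full rate
]
optimizer = torch.optim.Adam(param_groups)
print([g["lr"] for g in optimizer.param_groups])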

To tune or not to tune? Adapting pretrained representations to diverse tasks

ME Peters, S Ruder, NA Smith - arXiv preprint arXiv:1903.05987, 2019 - arxiv.org
While most previous work has focused on different pretraining objectives and architectures
for transfer learning, we ask how to best adapt the pretrained model to a given target task …

Supervised learning of universal sentence representations from natural language inference data

A Conneau, D Kiela, H Schwenk, L Barrault… - arXiv preprint arXiv …, 2017 - arxiv.org
Many modern NLP systems rely on word embeddings, previously trained in an unsupervised
manner on large corpora, as base features. Efforts to obtain embeddings for larger chunks of …

A broad-coverage challenge corpus for sentence understanding through inference

A Williams, N Nangia, SR Bowman - arXiv preprint arXiv:1704.05426, 2017 - arxiv.org
This paper introduces the Multi-Genre Natural Language Inference (MultiNLI) corpus, a
dataset designed for use in the development and evaluation of machine learning models for …

Learning protein sequence embeddings using information from structure

T Bepler, B Berger - arXiv preprint arXiv:1902.08661, 2019 - arxiv.org
Inferring the structural properties of a protein from its amino acid sequence is a challenging
yet important problem in biology. Structures are not known for the vast majority of protein …

Recall and learn: Fine-tuning deep pretrained language models with less forgetting

S Chen, Y Hou, Y Cui, W Che, T Liu, X Yu - arXiv preprint arXiv …, 2020 - arxiv.org
Deep pretrained language models have achieved great success through the pretrain-then-fine-tune
approach. But such a sequential transfer learning paradigm often confronts …