Natural language processing: state of the art, current trends and challenges

D Khurana, A Koli, K Khatter, S Singh - Multimedia tools and applications, 2023 - Springer
Natural language processing (NLP) has recently gained much attention for representing and
analyzing human language computationally. It has spread its applications in various fields …

Biomedical question answering: a survey of approaches and challenges

Q Jin, Z Yuan, G Xiong, Q Yu, H Ying, C Tan… - ACM Computing …, 2022 - dl.acm.org
Automatic Question Answering (QA) has been successfully applied in various domains such
as search engines and chatbots. Biomedical QA (BQA), as an emerging QA task, enables …

MERLOT Reserve: Neural script knowledge through vision and language and sound

R Zellers, J Lu, X Lu, Y Yu, Y Zhao… - Proceedings of the …, 2022 - openaccess.thecvf.com
As humans, we navigate a multimodal world, building a holistic understanding from all our
senses. We introduce MERLOT Reserve, a model that represents videos jointly over time …

Parameter-efficient transfer learning with diff pruning

D Guo, AM Rush, Y Kim - arXiv preprint arXiv:2012.07463, 2020 - arxiv.org
While task-specific finetuning of pretrained networks has led to significant empirical
advances in NLP, the large size of networks makes finetuning difficult to deploy in multi-task …

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

J Lee, W Yoon, S Kim, D Kim, S Kim, CH So… - …, 2020 - academic.oup.com
Motivation Biomedical text mining is becoming increasingly important as the number of
biomedical documents rapidly grows. With the progress in natural language processing …

Transfer learning in natural language processing

S Ruder, ME Peters, S Swayamdipta… - Proceedings of the 2019 …, 2019 - aclanthology.org
The classic supervised machine learning paradigm is based on learning in isolation, a
single predictive model for a task using a single dataset. This approach requires a large …

Open domain question answering using early fusion of knowledge bases and text

H Sun, B Dhingra, M Zaheer, K Mazaitis… - arXiv preprint arXiv …, 2018 - arxiv.org
Open Domain Question Answering (QA) is evolving from complex pipelined systems to end-
to-end deep neural networks. Specialized neural models have been developed for …

A survey on contextual embeddings

Q Liu, MJ Kusner, P Blunsom - arXiv preprint arXiv:2003.07278, 2020 - arxiv.org
Contextual embeddings, such as ELMo and BERT, move beyond global word
representations like Word2Vec and achieve ground-breaking performance on a wide range …

How should pre-trained language models be fine-tuned towards adversarial robustness?

X Dong, AT Luu, M Lin, S Yan… - Advances in Neural …, 2021 - proceedings.neurips.cc
The fine-tuning of pre-trained language models has achieved great success in many NLP fields. Yet,
it is strikingly vulnerable to adversarial examples, e.g., word substitution attacks using only …

Probing biomedical embeddings from language models

Q Jin, B Dhingra, WW Cohen, X Lu - arXiv preprint arXiv:1904.02181, 2019 - arxiv.org
Contextualized word embeddings derived from pre-trained language models (LMs) show
significant improvements on downstream NLP tasks. Pre-training on domain-specific …