Natural language processing: state of the art, current trends and challenges
Natural language processing (NLP) has recently gained much attention for representing and
analyzing human language computationally. It has spread its applications in various fields …
Biomedical question answering: a survey of approaches and challenges
Automatic Question Answering (QA) has been successfully applied in various domains such
as search engines and chatbots. Biomedical QA (BQA), as an emerging QA task, enables …
MERLOT Reserve: Neural script knowledge through vision and language and sound
As humans, we navigate a multimodal world, building a holistic understanding from all our
senses. We introduce MERLOT Reserve, a model that represents videos jointly over time …
Parameter-efficient transfer learning with diff pruning
While task-specific finetuning of pretrained networks has led to significant empirical
advances in NLP, the large size of networks makes finetuning difficult to deploy in multi-task …
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Motivation Biomedical text mining is becoming increasingly important as the number of
biomedical documents rapidly grows. With the progress in natural language processing …
Transfer learning in natural language processing
The classic supervised machine learning paradigm is based on learning in isolation: a
single predictive model for a task using a single dataset. This approach requires a large …
Open domain question answering using early fusion of knowledge bases and text
Open Domain Question Answering (QA) is evolving from complex pipelined systems to end-
to-end deep neural networks. Specialized neural models have been developed for …
A survey on contextual embeddings
Contextual embeddings, such as ELMo and BERT, move beyond global word
representations like Word2Vec and achieve ground-breaking performance on a wide range …
How should pre-trained language models be fine-tuned towards adversarial robustness?
The fine-tuning of pre-trained language models has achieved great success in many NLP fields. Yet,
it is strikingly vulnerable to adversarial examples, e.g., word substitution attacks using only …
Probing biomedical embeddings from language models
Contextualized word embeddings derived from pre-trained language models (LMs) show
significant improvements on downstream NLP tasks. Pre-training on domain-specific …