Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
Neural natural language processing for unstructured data in electronic health records: a review
Electronic health records (EHRs), digital collections of patient healthcare events and
observations, are ubiquitous in medicine and critical to healthcare delivery, operations, and …
Publicly available clinical BERT embeddings
Contextual word embedding models such as ELMo (Peters et al., 2018) and BERT (Devlin et
al., 2018) have dramatically improved performance for many natural language processing …
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Motivation: Biomedical text mining is becoming increasingly important as the number of
biomedical documents rapidly grows. With the progress in natural language processing …
Pretrained language models for biomedical and clinical tasks: understanding and extending the state-of-the-art
A large array of pretrained models are available to the biomedical NLP (BioNLP) community.
Finding the best model for a particular task can be difficult and time-consuming. For many …
Enhancing clinical concept extraction with contextual embeddings
Objective: Neural network–based representations (“embeddings”) have dramatically
advanced natural language processing (NLP) tasks, including clinical NLP tasks such as …
SciFive: a text-to-text transformer model for biomedical literature
In this report, we introduce SciFive, a domain-specific T5 model that has been pre-trained on
large biomedical corpora. Our model outperforms the current SOTA methods (i.e., BERT …
Deep learning in clinical natural language processing: a methodical review
Objective: This article methodically reviews the literature on deep learning (DL) for natural
language processing (NLP) in the clinical domain, providing quantitative analysis to answer …
A survey of word embeddings for clinical text
Representing words as numerical vectors based on the contexts in which they appear has
become the de facto method of analyzing text with machine learning. In this paper, we …
UmlsBERT: Clinical domain knowledge augmentation of contextual embeddings using the Unified Medical Language System Metathesaurus
Contextual word embedding models, such as BioBERT and Bio_ClinicalBERT, have
achieved state-of-the-art results in biomedical natural language processing tasks by …