Opportunities and challenges for ChatGPT and large language models in biomedicine and health
ChatGPT has drawn considerable attention from both the general public and domain experts with its remarkable text generation capabilities. This has subsequently led to the emergence …
A survey of knowledge enhanced pre-trained language models
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning, have yielded promising performance on various tasks in …
Galactica: A large language model for science
Information overload is a major obstacle to scientific progress. The explosive growth in scientific literature and data has made it ever harder to discover useful insights in a large …
Domain-specific language model pretraining for biomedical natural language processing
Pretraining large neural language models, such as BERT, has led to impressive gains on many natural language processing (NLP) tasks. However, most pretraining efforts focus on …
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Motivation: Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows. With the progress in natural language processing …
RadBERT: adapting transformer-based language models to radiology
Purpose: To investigate whether tailoring a transformer-based language model to radiology is beneficial for radiology natural language processing (NLP) applications. Materials and …
ScispaCy: fast and robust models for biomedical natural language processing
Despite recent advances in natural language processing, many statistical models for processing text perform extremely poorly under domain shift. Processing biomedical and …
ASRNN: A recurrent neural network with an attention model for sequence labeling
Natural language processing (NLP) is useful for handling text and speech, and sequence labeling plays an important role by automatically analyzing a sequence (text) to assign …
Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural language processing tasks. This also benefits the biomedical domain: researchers from …
Pretrained language models for biomedical and clinical tasks: understanding and extending the state-of-the-art
A large array of pretrained models is available to the biomedical NLP (BioNLP) community. Finding the best model for a particular task can be difficult and time-consuming. For many …