Opportunities and challenges for ChatGPT and large language models in biomedicine and health

S Tian, Q Jin, L Yeganova, PT Lai, Q Zhu… - Briefings in …, 2024 - academic.oup.com
ChatGPT has drawn considerable attention from both the general public and domain experts
with its remarkable text generation capabilities. This has subsequently led to the emergence …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via
self-supervised learning, have yielded promising performance on various tasks in …

Galactica: A large language model for science

R Taylor, M Kardas, G Cucurull, T Scialom… - arXiv preprint arXiv …, 2022 - arxiv.org
Information overload is a major obstacle to scientific progress. The explosive growth in
scientific literature and data has made it ever harder to discover useful insights in a large …

Domain-specific language model pretraining for biomedical natural language processing

Y Gu, R Tinn, H Cheng, M Lucas, N Usuyama… - ACM Transactions on …, 2021 - dl.acm.org
Pretraining large neural language models, such as BERT, has led to impressive gains on
many natural language processing (NLP) tasks. However, most pretraining efforts focus on …

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

J Lee, W Yoon, S Kim, D Kim, S Kim, CH So… - …, 2020 - academic.oup.com
Motivation: Biomedical text mining is becoming increasingly important as the number of
biomedical documents rapidly grows. With the progress in natural language processing …
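
BioBERT is released as standard BERT-format weights, so it can be loaded with ordinary BERT tooling. Below is a minimal sketch using Hugging Face transformers to extract contextual embeddings; the Hub ID dmis-lab/biobert-base-cased-v1.1 is an assumption (the commonly used DMIS Lab release), not something stated in the entry above. The same pattern applies to other domain-specific checkpoints, such as the biomedical model of Gu et al. listed earlier.

    # Minimal sketch: encode a biomedical sentence with BioBERT.
    # The checkpoint ID "dmis-lab/biobert-base-cased-v1.1" is assumed.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-base-cased-v1.1")
    model = AutoModel.from_pretrained("dmis-lab/biobert-base-cased-v1.1")

    inputs = tokenizer("EGFR mutations predict response to gefitinib.",
                       return_tensors="pt")
    outputs = model(**inputs)
    # last_hidden_state: (1, seq_len, 768) contextual token embeddings
    print(outputs.last_hidden_state.shape)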

RadBERT: adapting transformer-based language models to radiology

A Yan, J McAuley, X Lu, J Du, EY Chang… - Radiology: Artificial …, 2022 - pubs.rsna.org
Purpose: To investigate whether tailoring a transformer-based language model to radiology is
beneficial for radiology natural language processing (NLP) applications. Materials and …

ScispaCy: fast and robust models for biomedical natural language processing

M Neumann, D King, I Beltagy, W Ammar - arXiv preprint arXiv …, 2019 - arxiv.org
Despite recent advances in natural language processing, many statistical models for
processing text perform extremely poorly under domain shift. Processing biomedical and …
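
scispaCy ships its biomedical pipelines as installable spaCy model packages. A minimal usage sketch, assuming the small en_core_sci_sm pipeline has already been installed from the scispaCy releases (pip install scispacy plus the model wheel):

    # Minimal sketch: biomedical tokenization, tagging, and entity mentions
    # with scispaCy. Assumes the "en_core_sci_sm" package is installed.
    import spacy

    nlp = spacy.load("en_core_sci_sm")
    doc = nlp("Spinal and bulbar muscular atrophy is an X-linked recessive disease.")

    for ent in doc.ents:        # entity mentions from the NER component
        print(ent.text, ent.label_)
    for token in doc[:5]:       # domain-adapted tokenization and POS tags
        print(token.text, token.pos_)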

ASRNN: A recurrent neural network with an attention model for sequence labeling

JCW Lin, Y Shao, Y Djenouri, U Yun - Knowledge-Based Systems, 2021 - Elsevier
Natural language processing (NLP) is useful for handling text and speech, and sequence
labeling plays an important role by automatically analyzing a sequence (text) to assign …
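
The entry describes the general recipe of coupling a recurrent encoder with an attention model for sequence labeling. The sketch below illustrates that recipe as a generic BiLSTM-plus-self-attention tagger in PyTorch; it is a simplification for illustration, not the paper's ASRNN architecture, and all layer sizes are assumptions.

    # Generic attention-augmented RNN sequence labeler (illustrative only;
    # not the ASRNN architecture from the paper).
    import torch
    import torch.nn as nn

    class AttnRNNTagger(nn.Module):
        def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
            self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4,
                                              batch_first=True)
            self.out = nn.Linear(2 * hidden, num_tags)

        def forward(self, token_ids):
            h, _ = self.rnn(self.embed(token_ids))  # (B, T, 2H) states
            ctx, _ = self.attn(h, h, h)             # attend over the sequence
            return self.out(h + ctx)                # per-token tag scores

    tagger = AttnRNNTagger(vocab_size=10000, num_tags=9)
    tokens = torch.randint(0, 10000, (2, 12))       # toy batch
    print(tagger(tokens).shape)                     # torch.Size([2, 12, 9])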

Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

Pretrained language models for biomedical and clinical tasks: understanding and extending the state-of-the-art

P Lewis, M Ott, J Du, V Stoyanov - Proceedings of the 3rd Clinical …, 2020 - aclanthology.org
A large array of pretrained models are available to the biomedical NLP (BioNLP) community.
Finding the best model for a particular task can be difficult and time-consuming. For many …