Opportunities and challenges for ChatGPT and large language models in biomedicine and health

S Tian, Q Jin, L Yeganova, PT Lai, Q Zhu… - Briefings in …, 2024 - academic.oup.com
ChatGPT has drawn considerable attention from both the general public and domain experts
with its remarkable text generation capabilities. This has subsequently led to the emergence …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-
supervised learning, have yielded promising performance on various tasks in …

[PDF] Galactica: A large language model for science

R Taylor, M Kardas, G Cucurull, T Scialom… - arXiv preprint arXiv …, 2022 - galactica.org
Information overload is a major obstacle to scientific progress. The explosive growth
in scientific literature and data has made it ever harder to discover useful insights in a large …

Domain-specific language model pretraining for biomedical natural language processing

Y Gu, R Tinn, H Cheng, M Lucas, N Usuyama… - ACM Transactions on …, 2021 - dl.acm.org
Pretraining large neural language models, such as BERT, has led to impressive gains on
many natural language processing (NLP) tasks. However, most pretraining efforts focus on …

An extensive benchmark study on biomedical text generation and mining with ChatGPT

Q Chen, H Sun, H Liu, Y Jiang, T Ran, X Jin… - …, 2023 - academic.oup.com
Motivation: In recent years, the development of natural language processing (NLP) technologies
and deep learning hardware has led to significant improvements in large language models …

RadBERT: adapting transformer-based language models to radiology

A Yan, J McAuley, X Lu, J Du, EY Chang… - Radiology: Artificial …, 2022 - pubs.rsna.org
Purpose: To investigate whether tailoring a transformer-based language model to radiology is
beneficial for radiology natural language processing (NLP) applications. Materials and …

Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

J Lee, W Yoon, S Kim, D Kim, S Kim, CH So… - …, 2020 - academic.oup.com
Motivation: Biomedical text mining is becoming increasingly important as the number of
biomedical documents rapidly grows. With the progress in natural language processing …

Thinking about GPT-3 in-context learning for biomedical IE? Think again

BJ Gutierrez, N McNeal, C Washington, Y Chen… - arXiv preprint arXiv …, 2022 - arxiv.org
The strong few-shot in-context learning capability of large pre-trained language models
(PLMs) such as GPT-3 is highly appealing for application domains such as biomedicine …

Domain adaptation: challenges, methods, datasets, and applications

P Singhal, R Walambe, S Ramanna, K Kotecha - IEEE Access, 2023 - ieeexplore.ieee.org
Deep Neural Networks (DNNs) trained on one dataset (source domain) do not perform well
on another set of data (target domain) that is different from, but shares similar properties with, the …