Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
AMMU: a survey of transformer-based biomedical pretrained language models
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …
A survey of large language models for healthcare: from data, technology, and applications to accountability and ethics
The utilization of large language models (LLMs) for Healthcare has generated both
excitement and concern due to their ability to effectively respond to free-text queries with …
Self-alignment pretraining for biomedical entity representations
Despite the widespread success of self-supervised learning via masked language models
(MLM), accurately capturing fine-grained semantic relationships in the biomedical domain …
BioBART: Pretraining and evaluation of a biomedical generative language model
Pretrained language models have served as important backbones for natural language
processing. Recently, in-domain pretraining has been shown to benefit various domain …
A survey on clinical natural language processing in the United Kingdom from 2007 to 2022
Much of the knowledge and information needed for enabling high-quality clinical research is
stored in free-text format. Natural language processing (NLP) has been used to extract …
A comprehensive evaluation of large language models on benchmark biomedical text processing tasks
Recently, Large Language Models (LLMs) have demonstrated impressive
capability to solve a wide range of tasks. However, despite their success across various …
Fast, effective, and self-supervised: Transforming masked language models into universal lexical and sentence encoders
Pretrained Masked Language Models (MLMs) have revolutionised NLP in recent years.
However, previous work has indicated that off-the-shelf MLMs are not effective as universal …
Does the magic of BERT apply to medical code assignment? A quantitative study
Unsupervised pretraining is an integral part of many natural language processing systems,
and transfer learning with language models has achieved remarkable results in downstream …
A comprehensive survey on evaluating large language model applications in the medical industry
Y Huang, K Tang, M Chen, B Wang - arXiv preprint arXiv:2404.15777, 2024 - arxiv.org
Since the inception of the Transformer architecture in 2017, Large Language Models (LLMs)
such as GPT and BERT have evolved significantly, impacting various industries with their …