AMMUS: A survey of transformer-based pretrained models in natural language processing
KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in
almost every NLP task. The evolution of these models started with GPT and BERT. These …
Pre-trained language models in biomedical domain: A systematic survey
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …
Neural natural language processing for unstructured data in electronic health records: a review
Electronic health records (EHRs), digital collections of patient healthcare events and
observations, are ubiquitous in medicine and critical to healthcare delivery, operations, and …
AMMU: a survey of transformer-based biomedical pretrained language models
Transformer-based pretrained language models (PLMs) have started a new era in modern
natural language processing (NLP). These models combine the power of transformers …
Biomedical question answering: a survey of approaches and challenges
Automatic Question Answering (QA) has been successfully applied in various domains such
as search engines and chatbots. Biomedical QA (BQA), as an emerging QA task, enables …
Uncertainty quantification with pre-trained language models: A large-scale empirical analysis
Pre-trained language models (PLMs) have gained increasing popularity due to their
compelling prediction performance in diverse natural language processing (NLP) tasks …
Artificial intelligence foundation and pre-trained models: Fundamentals, applications, opportunities, and social impacts
A Kolides, A Nawaz, A Rathor, D Beeman… - … Modelling Practice and …, 2023 - Elsevier
With the emergence of foundation models (FMs) that are trained on large amounts of data at
scale and adaptable to a wide range of downstream applications, AI is experiencing a …
Paper Plain: Making Medical Research Papers Approachable to Healthcare Consumers with Natural Language Processing
When seeking information not covered in patient-friendly documents, healthcare consumers
may turn to the research literature. Reading medical papers, however, can be a challenging …
Knowledgeable preference alignment for llms in domain-specific question answering
Deploying large language models (LLMs) to real scenarios for domain-specific question
answering (QA) is a key thrust for LLM applications, which poses numerous challenges …
Transformers in healthcare: A survey
With Artificial Intelligence (AI) increasingly permeating various aspects of society, including
healthcare, the adoption of the Transformers neural network architecture is rapidly changing …