GPT (Generative Pre-trained Transformer) – A comprehensive review on enabling technologies, potential applications, emerging challenges, and future directions

G Yenduri, M Ramalingam, GC Selvi, Y Supriya… - IEEE …, 2024 - ieeexplore.ieee.org
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in the
domain of natural language processing, which is propelling us toward the development of …

Large-scale multi-modal pre-trained models: A comprehensive survey

X Wang, G Chen, G Qian, P Gao, XY Wei… - Machine Intelligence …, 2023 - Springer
With the urgent demand for generalized deep models, many pre-trained big models have been
proposed, such as bidirectional encoder representations (BERT), vision transformer (ViT) …

Efficient large language models: A survey

Z Wan, X Wang, C Liu, S Alam, Y Zheng, J Liu… - arXiv preprint arXiv …, 2023 - arxiv.org
Large Language Models (LLMs) have demonstrated remarkable capabilities in important
tasks such as natural language understanding and language generation, and thus have the …

HuatuoGPT, towards taming language model to be a doctor

H Zhang, J Chen, F Jiang, F Yu, Z Chen, J Li… - arXiv preprint arXiv …, 2023 - arxiv.org
In this paper, we present HuatuoGPT, a large language model (LLM) for medical
consultation. The core recipe of HuatuoGPT is to leverage both distilled data from …

Leveraging generative AI and large language models: a comprehensive roadmap for healthcare integration

P Yu, H Xu, X Hu, C Deng - Healthcare, 2023 - mdpi.com
Generative artificial intelligence (AI) and large language models (LLMs), exemplified by
ChatGPT, are promising for revolutionizing data and information management in healthcare …

COVID-Transformer: Interpretable COVID-19 detection using vision transformer for healthcare

D Shome, T Kar, SN Mohanty, P Tiwari… - International Journal of …, 2021 - mdpi.com
In the recent pandemic, accurate and rapid testing of patients remained a critical task in the
diagnosis and control of COVID-19 disease spread in the healthcare industry. Because of …

Fairness in large language models: A taxonomic survey

Z Chu, Z Wang, W Zhang - ACM SIGKDD explorations newsletter, 2024 - dl.acm.org
Large Language Models (LLMs) have demonstrated remarkable success across various
domains. However, despite their promising performance in numerous real-world …

K2: A foundation language model for geoscience knowledge understanding and utilization

C Deng, T Zhang, Z He, Q Chen, Y Shi, Y Xu… - Proceedings of the 17th …, 2024 - dl.acm.org
Large language models (LLMs) have achieved great success in general domains of natural
language processing. In this paper, we bring LLMs to the realm of geoscience with the …

Pre-trained language models with domain knowledge for biomedical extractive summarization

Q **e, JA Bishop, P Tiwari, S Ananiadou - Knowledge-Based Systems, 2022 - Elsevier
Biomedical text summarization is a critical task for comprehension of an ever-growing
amount of biomedical literature. Pre-trained language models (PLMs) with transformer …

HuatuoGPT-II, one-stage training for medical adaption of LLMs

J Chen, X Wang, K Ji, A Gao, F Jiang, S Chen… - arXiv preprint arXiv …, 2023 - arxiv.org
Adapting a language model into a specific domain, aka 'domain adaption', is a common
practice when specialized knowledge, e.g., medicine, is not encapsulated in a general …