ChatGPT: A comprehensive review on background, applications, key challenges, bias, ethics, limitations and future scope

PP Ray - Internet of Things and Cyber-Physical Systems, 2023 - Elsevier
In recent years, artificial intelligence (AI) and machine learning have been transforming the
landscape of scientific research. Among these, chatbot technology has experienced …

Parameter-efficient fine-tuning methods for pretrained language models: A critical review and assessment

L Xu, H Xie… - arXiv preprint arXiv…, 2023 - arxiv.org

Quantizable transformers: Removing outliers by helping attention heads do nothing

Y Bondarenko, M Nagel… - Advances in Neural …, 2023 - proceedings.neurips.cc
Transformer models have been widely adopted in various domains over the last years and
especially large language models have advanced the field of AI significantly. Due to their …

A survey of transformers

T Lin, Y Wang, X Liu, X Qiu - AI Open, 2022 - Elsevier
Transformers have achieved great success in many artificial intelligence fields, such as
natural language processing, computer vision, and audio processing. Therefore, it is natural …

Knowledge neurons in pretrained transformers

D Dai, L Dong, Y Hao, Z Sui, B Chang, F Wei - arXiv preprint arXiv…, 2021 - arxiv.org
Large-scale pretrained language models are surprisingly good at recalling factual
knowledge presented in the training corpus. In this paper, we present preliminary studies on …

AMMUS: A survey of transformer-based pretrained models in natural language processing

KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv…, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in
almost every NLP task. The evolution of these models started with GPT and BERT. These …