A comprehensive survey of AI-generated content (AIGC): A history of generative AI from GAN to ChatGPT

Y Cao, S Li, Y Liu, Z Yan, Y Dai, PS Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Recently, ChatGPT, along with DALL-E-2 and Codex, has been gaining significant attention
from society. As a result, many individuals have become interested in related resources and …

A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - International Journal of …, 2024 - Springer
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …

Exploring the potential of large language models (LLMs) in learning on graphs

Z Chen, H Mao, H Li, W Jin, H Wen, X Wei… - ACM SIGKDD …, 2024 - dl.acm.org
Learning on Graphs has attracted immense attention due to its wide real-world applications.
The most popular pipeline for learning on graphs with textual node attributes primarily relies …

Pre-trained language models and their applications

H Wang, J Li, H Wu, E Hovy, Y Sun - Engineering, 2023 - Elsevier
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …

CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation

Y Wang, W Wang, S Joty, SCH Hoi - arXiv preprint arXiv:2109.00859, 2021 - arxiv.org
Pre-trained models for Natural Languages (NL) like BERT and GPT have been recently
shown to transfer well to Programming Languages (PL) and largely benefit a broad set of …

Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing

P Liu, W Yuan, J Fu, Z Jiang, H Hayashi… - ACM Computing …, 2023 - dl.acm.org
This article surveys and organizes research works in a new paradigm in natural language
processing, which we dub “prompt-based learning.” Unlike traditional supervised learning …

A survey of large language models in medicine: Progress, application, and challenge

H Zhou, F Liu, B Gu, X Zou, J Huang, J Wu, Y Li… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs), such as ChatGPT, have received substantial attention due
to their capabilities for understanding and generating human language. While there has …

A survey on text classification: From traditional to deep learning

Q Li, H Peng, J Li, C Xia, R Yang, L Sun… - ACM Transactions on …, 2022 - dl.acm.org
Text classification is the most fundamental and essential task in natural language
processing. The last decade has seen a surge of research in this area due to the …

ERNIE-ViLG 2.0: Improving text-to-image diffusion model with knowledge-enhanced mixture-of-denoising-experts

Z Feng, Z Zhang, X Yu, Y Fang, L Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Recent progress in diffusion models has revolutionized the popular technology of text-to-
image generation. While existing approaches could produce photorealistic high-resolution …

A robustly optimized BERT pre-training approach with post-training

Z Liu, W Lin, Y Shi, J Zhao - China National Conference on Chinese …, 2021 - Springer
In the paper, we present a 'pre-training' + 'post-training' + 'fine-tuning' three-stage paradigm, which is a supplementary framework for the standard 'pre-training' + 'fine-tuning' language model approach. Furthermore, based on the three-stage paradigm …