A comprehensive survey of AI-generated content (AIGC): A history of generative AI from GAN to ChatGPT
Recently, ChatGPT, along with DALL-E-2 and Codex, has been gaining significant attention
from society. As a result, many individuals have become interested in related resources and …
A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
Exploring the potential of large language models (LLMs) in learning on graphs
Learning on Graphs has attracted immense attention due to its wide real-world applications.
The most popular pipeline for learning on graphs with textual node attributes primarily relies …
Pre-trained language models and their applications
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …
CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation
Pre-trained models for Natural Languages (NL) like BERT and GPT have been recently
shown to transfer well to Programming Languages (PL) and largely benefit a broad set of …
Pre-train, prompt, and predict: A systematic survey of prompting methods in natural language processing
This article surveys and organizes research works in a new paradigm in natural language
processing, which we dub “prompt-based learning.” Unlike traditional supervised learning …
A survey of large language models in medicine: Progress, application, and challenge
Large language models (LLMs), such as ChatGPT, have received substantial attention due
to their capabilities for understanding and generating human language. While there has …
A survey on text classification: From traditional to deep learning
Text classification is the most fundamental and essential task in natural language
processing. The last decade has seen a surge of research in this area due to the …
ERNIE-ViLG 2.0: Improving text-to-image diffusion model with knowledge-enhanced mixture-of-denoising-experts
Recent progress in diffusion models has revolutionized the popular technology of text-to-
image generation. While existing approaches could produce photorealistic high-resolution …
A robustly optimized BERT pre-training approach with post-training
Z Liu, W Lin, Y Shi, J Zhao - China National Conference on Chinese …, 2021 - Springer
In the paper, we present a "pre-training + post-training + fine-tuning" three-stage paradigm, which is a supplementary framework for the standard "pre-training + fine-tuning" language model approach. Furthermore, based on the three-stage paradigm …