A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
A comprehensive review on ensemble deep learning: Opportunities and challenges
In machine learning, two approaches outperform traditional algorithms: ensemble learning
and deep learning. The former refers to methods that integrate multiple base models in the …
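To make the contrast concrete, here is a minimal, illustrative sketch of one common ensembling scheme, soft voting (averaging class probabilities across base models); the base models and dataset below are placeholder assumptions, not drawn from the cited review.

```python
# Illustrative soft-voting ensemble; the base models and dataset are
# placeholder assumptions, not taken from the cited review.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
base_models = [LogisticRegression(max_iter=1000), DecisionTreeClassifier()]
for m in base_models:
    m.fit(X, y)  # each base model is trained independently

# Soft voting: average the class probabilities, then take the argmax.
avg_probs = np.mean([m.predict_proba(X) for m in base_models], axis=0)
predictions = np.argmax(avg_probs, axis=1)
```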
OpenAGI: When LLM meets domain experts
Human Intelligence (HI) excels at combining basic skills to solve complex tasks. This
capability is vital for Artificial Intelligence (AI) and should be embedded in comprehensive AI …
Why can GPT learn in-context? Language models implicitly perform gradient descent as meta-optimizers
Large pretrained language models have shown surprising in-context learning (ICL) ability.
With a few demonstration input-label pairs, they can predict the label for an unseen input …
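As an illustration of the ICL setup this abstract describes, the sketch below assembles a few-shot prompt from demonstration input-label pairs; the demonstrations and query are hypothetical, and no particular model API is assumed.

```python
# Hypothetical few-shot prompt for in-context learning: demonstration
# input-label pairs are concatenated ahead of an unseen query, and a
# pretrained LM would be asked to complete the final label.
demonstrations = [
    ("The movie was wonderful.", "positive"),
    ("I hated every minute of it.", "negative"),
]
query = "A surprisingly touching film."

prompt = "".join(f"Review: {x}\nLabel: {y}\n\n" for x, y in demonstrations)
prompt += f"Review: {query}\nLabel:"
print(prompt)  # this string is what would be sent to the model for completion
```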
RLPrompt: Optimizing discrete text prompts with reinforcement learning
Prompting has shown impressive success in enabling large pretrained language models
(LMs) to perform diverse NLP tasks, especially when only a few downstream data are …
Bidirectional convolutional recurrent neural network architecture with group-wise enhancement mechanism for text sentiment classification
Sentiment analysis has been a well-studied research direction in computational linguistics.
Deep neural network models, including convolutional neural networks (CNN) and recurrent …
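For orientation, here is a minimal PyTorch sketch of the bidirectional recurrent encoder such architectures build on; the paper's group-wise enhancement mechanism and convolutional components are not reproduced, and all dimensions are placeholder assumptions.

```python
# Minimal bidirectional-LSTM text classifier (sketch only; this is NOT the
# cited paper's architecture, and all sizes are placeholder assumptions).
import torch
import torch.nn as nn

class BiLSTMSentiment(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)  # both directions concatenated

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        out, _ = self.rnn(x)        # (batch, seq_len, 2 * hidden_dim)
        pooled = out.mean(dim=1)    # average-pool over time steps
        return self.fc(pooled)      # class logits

logits = BiLSTMSentiment()(torch.randint(0, 10_000, (4, 20)))  # toy batch
```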
MetaICL: Learning to learn in context
We introduce MetaICL (Meta-training for In-Context Learning), a new meta-training
framework for few-shot learning where a pretrained language model is tuned to do in …
Sentiment analysis in the era of large language models: A reality check
Sentiment analysis (SA) has been a long-standing research area in natural language
processing. It can offer rich insights into human sentiments and opinions and has thus seen …
A survey on text classification: From traditional to deep learning
Text classification is the most fundamental and essential task in natural language
processing. The last decade has seen a surge of research in this area due to the …