AMMUS: A survey of transformer-based pretrained models in natural language processing
KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in
almost every NLP task. The evolution of these models started with GPT and BERT. These …
Language model behavior: A comprehensive survey
Transformer language models have received widespread public attention, yet their
generated text is often surprising even to NLP researchers. In this survey, we discuss over …
Few-shot learning with language models: Learning from instructions and contexts
T Schick - 2022 - edoc.ub.uni-muenchen.de
Pretraining deep neural networks to perform language modeling–that is, to reconstruct
missing words from incomplete pieces of text–has brought large improvements throughout …
Quantifying the impact of Twitter activity in political battlegrounds
M Kaur Baxi - 2022 - thesis.lakeheadu.ca
It may be challenging to determine the reach of the information, how well it corresponds with
the domain design, and how to utilize it as a communication medium when utilizing social …