Trained transformers learn linear models in-context
Attention-based neural networks such as transformers have demonstrated a remarkable
ability to exhibit in-context learning (ICL): Given a short prompt sequence of tokens from an …
Understanding in-context learning in transformers and LLMs by learning to learn discrete functions
In order to understand the in-context learning phenomenon, recent works have adopted a
stylized experimental framework and demonstrated that Transformers can learn gradient …
A theoretical understanding of self-correction through in-context alignment
Going beyond mimicking limited human experiences, recent studies show initial evidence
that, like humans, large language models (LLMs) are capable of improving their abilities …
Drift-resilient TabPFN: In-context learning temporal distribution shifts on tabular data
While most ML models expect independent and identically distributed data, this assumption
is often violated in real-world scenarios due to distribution shifts, resulting in the degradation …
On mesa-optimization in autoregressively trained transformers: Emergence and capability
Autoregressively trained transformers have brought a profound revolution to the world,
especially with their in-context learning (ICL) ability to address downstream tasks. Recently …
How well does GPT-4V(ision) adapt to distribution shifts? A preliminary investigation
In machine learning, generalization against distribution shifts--where deployment conditions
diverge from the training scenarios--is crucial, particularly in fields like climate modeling …
How In-Context Learning Emerges from Training on Unstructured Data: On the Role of Co-Occurrence, Positional Information, and Noise Structures
Large language models (LLMs) like transformers have impressive in-context learning (ICL)
capabilities; they can generate predictions for new queries based on input-output …
From Unstructured Data to In-Context Learning: Exploring What Tasks Can Be Learned and When
Large language models (LLMs) like transformers demonstrate impressive in-context
learning (ICL) capabilities, allowing them to make predictions for new tasks based on prompt …
In-context learning in presence of spurious correlations
Large language models exhibit a remarkable capacity for in-context learning, where they
learn to solve tasks given a few examples. Recent work has shown that transformers can be …
Technical Debt in In-Context Learning: Diminishing Efficiency in Long Context
Transformers have demonstrated remarkable in-context learning (ICL) capabilities, adapting
to new tasks by simply conditioning on demonstrations without parameter updates …