What can transformers learn in-context? A case study of simple function classes
In-context learning is the ability of a model to condition on a prompt sequence consisting of
in-context examples (input-output pairs corresponding to some task) along with a new query …
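The prompting setup described here is simple to make concrete. Below is a minimal Python sketch of an in-context prompt assembled from (input, output) pairs plus a held-out query; the linear target function and the textual format are illustrative assumptions, not the paper's exact protocol.

    # Minimal sketch: build an in-context prompt from (input, output) pairs
    # for a simple function class, then append a new query. The linear
    # function and the "input/output" wording are illustrative assumptions.
    def build_icl_prompt(examples, query):
        # examples: list of (x, y) pairs drawn from the target function
        lines = [f"input: {x} output: {y}" for x, y in examples]
        lines.append(f"input: {query} output:")  # model must infer y for the query
        return "\n".join(lines)

    # Example task: y = 3*x + 1, presented only through in-context pairs;
    # the mapping itself is never stated in the prompt.
    pairs = [(x, 3 * x + 1) for x in range(4)]
    prompt = build_icl_prompt(pairs, query=10)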
Leveraging large language models for predictive chemistry
Abstract Machine learning has transformed many fields and has recently found applications
in chemistry and materials science. The small datasets commonly found in chemistry …
Large language models as general pattern machines
We observe that pre-trained large language models (LLMs) are capable of autoregressively
completing complex token sequences--from arbitrary ones procedurally generated by …
Transformers as algorithms: Generalization and stability in in-context learning
In-context learning (ICL) is a type of prompting where a transformer model operates on a
sequence of (input, output) examples and performs inference on-the-fly. In this work, we …
Large language models on tabular data--a survey
Recent breakthroughs in large language modeling have facilitated rigorous exploration of
their application in diverse tasks related to tabular data modeling, such as prediction, tabular …
Language models are weak learners
A central notion in practical and theoretical machine learning is that of a weak learner,
classifiers that achieve better-than-random performance (on any given distribution over …
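The weak-learner condition named in this abstract amounts to accuracy exceeding chance by some margin. A minimal Python sketch of that check, where the class count and margin gamma are assumed parameters rather than values from the paper:

    # Sketch: a classifier qualifies as a weak learner if its accuracy
    # beats random guessing by some margin gamma. n_classes and gamma
    # are illustrative parameters.
    def is_weak_learner(predictions, labels, n_classes=2, gamma=0.05):
        correct = sum(p == y for p, y in zip(predictions, labels))
        accuracy = correct / len(labels)
        return accuracy > 1.0 / n_classes + gamma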
PromptCast: A new prompt-based learning paradigm for time series forecasting
This paper presents a new perspective on time series forecasting. In existing time series
forecasting methods, the models take a sequence of numerical values as input and yield …
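The core move is to serialize a numeric history into a sentence and let the model answer in text. A small Python sketch under an assumed template (the wording below is a hypothetical stand-in, not the paper's actual prompt format):

    # Sketch: cast a numeric series into a text prompt, in the spirit of
    # prompt-based forecasting. The sentence template is hypothetical.
    def series_to_prompt(values, horizon=1):
        history = ", ".join(str(v) for v in values)
        return (f"The values for the last {len(values)} steps were {history}. "
                f"What will the next {horizon} value(s) be?")

    prompt = series_to_prompt([120, 132, 129, 141])
    # A language model's text completion is then parsed back into numbers.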
Large language models for time series: A survey
Large Language Models (LLMs) have seen significant use in domains such as natural
language processing and computer vision. Going beyond text, image and graphics, LLMs …
Contrast everything: A hierarchical contrastive framework for medical time-series
Y Wang, Y Han, H Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Contrastive representation learning is crucial in medical time series analysis as it alleviates
dependency on labor-intensive, domain-specific, and scarce expert annotations. However …
Are large language models superhuman chemists?
Large language models (LLMs) have gained widespread interest due to their ability to
process human language and perform tasks on which they have not been explicitly trained …
process human language and perform tasks on which they have not been explicitly trained …