Neural network approximation
Neural networks (NNs) are the method of choice for building learning algorithms. They are
now being investigated for other numerical tasks such as solving high-dimensional partial …
Foundational challenges in assuring alignment and safety of large language models
This work identifies 18 foundational challenges in assuring the alignment and safety of large
language models (LLMs). These challenges are organized into three different categories …
The power of quantum neural networks
It is unknown whether near-term quantum computers are advantageous for machine
learning tasks. In this work we address this question by trying to understand how powerful …
Intrinsic dimensionality explains the effectiveness of language model fine-tuning
Although pretrained language models can be fine-tuned to produce state-of-the-art results
for a very wide range of language understanding tasks, the dynamics of this process are not …
Pruning neural networks without any data by iteratively conserving synaptic flow
Pruning the parameters of deep neural networks has generated intense interest due to
potential savings in time, memory and energy both during training and at test time. Recent …
Task-specific skill localization in fine-tuned language models
Pre-trained language models can be fine-tuned to solve diverse NLP tasks, including in few-
shot settings. Thus fine-tuning allows the model to quickly pick up task-specific "skills," but …
A primer on Bayesian neural networks: review and debates
Neural networks have achieved remarkable performance across various problem domains,
but their widespread applicability is hindered by inherent limitations such as overconfidence …
Learning imbalanced datasets with label-distribution-aware margin loss
Deep learning algorithms can fare poorly when the training dataset suffers from heavy class imbalance but the testing criterion requires good generalization on less frequent classes …
Fantastic generalization measures and where to find them
Generalization of deep networks has been of great interest in recent years, resulting in a
number of theoretically and empirically motivated complexity measures. However, most …
Fine-grained analysis of optimization and generalization for overparameterized two-layer neural networks
Recent works have cast some light on the mystery of why deep nets fit any data and
generalize despite being very overparametrized. This paper analyzes training and …