Overview frequency principle/spectral bias in deep learning
Understanding deep learning is increasingly urgent as it penetrates more and more into
industry and science. In recent years, a research line from Fourier analysis sheds light on …
Robust heterogeneous federated learning under data corruption
Abstract Model heterogeneous federated learning is a realistic and challenging problem.
However, due to the limitations of data collection, storage, and transmission conditions, as …
Single chip photonic deep neural network with accelerated training
As deep neural networks (DNNs) revolutionize machine learning, energy consumption and
throughput are emerging as fundamental limitations of CMOS electronics. This has …
Word order does matter and shuffled language models know it
Recent studies have shown that language models pretrained and/or fine-tuned on randomly
permuted sentences exhibit competitive performance on GLUE, putting into question the …
FedFA: Federated feature augmentation
Federated learning is a distributed paradigm that allows multiple parties to collaboratively
train deep models without exchanging the raw data. However, the data distribution among …
Noisy recurrent neural networks
We provide a general framework for studying recurrent neural networks (RNNs) trained by
injecting noise into hidden states. Specifically, we consider RNNs that can be viewed as …
Explicit regularization in overparametrized models via noise injection
Injecting noise within gradient descent has several desirable features, such as smoothing
and regularizing properties. In this paper, we investigate the effects of injecting noise before …
Noisy feature mixup
We introduce Noisy Feature Mixup (NFM), an inexpensive yet effective method for data
augmentation that combines the best of interpolation-based training and noise injection …
Neural decoding reveals specialized kinematic tuning after an abrupt cortical transition
The primary motor cortex (M1) exhibits a protracted period of development, including the
development of a sensory representation long before motor outflow emerges. In rats, this …
On the generalization of models trained with SGD: Information-theoretic bounds and implications
This paper follows up on a recent work of Neu et al. (2021) and presents some new
information-theoretic upper bounds for the generalization error of machine learning models …