Deep learning meets sparse regularization: A signal processing perspective

R Parhi, RD Nowak - IEEE Signal Processing Magazine, 2023 - ieeexplore.ieee.org
Deep learning (DL) has been wildly successful in practice, and most of the state-of-the-art
machine learning methods are based on neural networks (NNs). Lacking, however, is a …

Learning with norm constrained, over-parameterized, two-layer neural networks

F Liu, L Dadi, V Cevher - Journal of Machine Learning Research, 2024 - jmlr.org
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space
to model functions by neural networks as the curse of dimensionality (CoD) cannot be …

Sparse machine learning in Banach spaces

Y Xu - Applied Numerical Mathematics, 2023 - Elsevier
The aim of this expository paper is to explain to graduate students and beginning
researchers in the fields of mathematics, statistics, and engineering the fundamental concept …

Optimal Rates of Approximation by Shallow ReLU^k Neural Networks and Applications to Nonparametric Regression

Y Yang, DX Zhou - Constructive Approximation, 2024 - Springer
We study the approximation capacity of some variation spaces corresponding to shallow
ReLU^k neural networks. It is shown that sufficiently smooth functions are contained in these …
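
For orientation (a standard definition, not taken from the truncated snippet above): a shallow ReLU^k network of width n is a function of the form

\[
f(\mathbf{x}) \;=\; \sum_{i=1}^{n} a_i \,\big(\mathbf{w}_i^\top \mathbf{x} + b_i\big)_{+}^{k},
\qquad t_{+}^{k} := \max(t,0)^{k},
\]

so k = 1 recovers the ordinary ReLU network, while larger k gives smoother piecewise-polynomial activations.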

Ridges, neural networks, and the Radon transform

M Unser - Journal of Machine Learning Research, 2023 - jmlr.org
A ridge is a function that is characterized by a one-dimensional profile (activation) and a
multidimensional direction vector. Ridges appear in the theory of neural networks as …
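
For orientation (a standard definition consistent with this snippet, not taken from the truncated text): a ridge is a function of the form

\[
r(\mathbf{x}) \;=\; \rho\big(\mathbf{w}^\top \mathbf{x}\big), \qquad \mathbf{x} \in \mathbb{R}^{d},
\]

where \(\rho : \mathbb{R} \to \mathbb{R}\) is the one-dimensional profile and \(\mathbf{w} \in \mathbb{R}^{d}\) is the direction vector; a single neuron \(\mathbf{x} \mapsto \sigma(\mathbf{w}^\top \mathbf{x} - b)\) is a ridge whose profile is the shifted activation.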

Duality for neural networks through reproducing kernel Banach spaces

L Spek, TJ Heeringa, F Schwenninger… - arXiv preprint arXiv …, 2022 - arxiv.org
Reproducing Kernel Hilbert spaces (RKHS) have been a very successful tool in various
areas of machine learning. Recently, Barron spaces have been used to prove bounds on the …

Nonparametric regression using over-parameterized shallow ReLU neural networks

Y Yang, DX Zhou - Journal of Machine Learning Research, 2024 - jmlr.org
It is shown that over-parameterized neural networks can achieve minimax optimal rates of
convergence (up to logarithmic factors) for learning functions from certain smooth function …

Variation spaces for multi-output neural networks: Insights on multi-task learning and network compression

J Shenouda, R Parhi, K Lee, RD Nowak - Journal of Machine Learning …, 2024 - jmlr.org
This paper introduces a novel theoretical framework for the analysis of vector-valued neural
networks through the development of vector-valued variation spaces, a new class of …

Sparse representer theorems for learning in reproducing kernel Banach spaces

R Wang, Y Xu, M Yan - Journal of Machine Learning Research, 2024 - jmlr.org
Sparsity of a learning solution is a desirable feature in machine learning. Certain
reproducing kernel Banach spaces (RKBSs) are appropriate hypothesis spaces for sparse …

Neural reproducing kernel Banach spaces and representer theorems for deep networks

F Bartolucci, E De Vito, L Rosasco… - arXiv preprint arXiv …, 2024 - arxiv.org
Studying the function spaces defined by neural networks helps to understand the
corresponding learning models and their inductive bias. While in some limits neural …