Adaptivity of deep ReLU network for learning in Besov and mixed smooth Besov spaces: optimal rate and curse of dimensionality
T Suzuki - arXiv preprint arXiv:1810.08033, 2018 - arxiv.org
Deep learning has shown high performance in various types of tasks, from visual
recognition to natural language processing, which indicates superior flexibility and adaptivity …
[BOOK][B] Hyperbolic cross approximation
D Dũng, V Temlyakov, T Ullrich - 2018 - books.google.com
This book provides a systematic survey of classical and recent results on hyperbolic cross
approximation. Motivated by numerous applications, the last two decades have seen great …
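For context, the hyperbolic cross behind this line of work is usually the frequency index set (a standard definition, not quoted from the book; notation varies by author):

\Gamma(N) = \bigl\{ k \in \mathbb{Z}^d : \textstyle\prod_{j=1}^{d} \max(1, |k_j|) \le N \bigr\},

and hyperbolic cross approximation means approximating a multivariate periodic function by trigonometric polynomials whose frequencies are restricted to \Gamma(N), which is far smaller than the full cube of frequencies up to N in each coordinate.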
[PDF][PDF] Compressive Sensing.
M Fornasier, H Rauhut - Handbook of mathematical methods in …, 2015 - ee301.wikidot.com
Compressive sensing is a new type of sampling theory, which predicts that sparse signals
and images can be reconstructed from what was previously believed to be incomplete …
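The sparse-recovery claim in this snippet can be made concrete. Below is a minimal sketch, not code from the handbook chapter (all sizes and variable names are arbitrary choices), that reconstructs an s-sparse vector from m << n random measurements by l1-minimization (basis pursuit), solved as a linear program with SciPy:

# Illustrative sketch of compressive sensing via l1-minimization (basis pursuit).
# Not taken from the cited chapter; problem sizes are arbitrary.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 200, 80, 8                          # ambient dimension, measurements, sparsity

x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)       # s-sparse ground truth

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian measurement matrix
y = A @ x_true                                  # "incomplete" linear measurements (m << n)

# Basis pursuit: minimize ||x||_1 subject to A x = y,
# rewritten as a linear program in the split variables (x+, x-) >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

With these sizes the printed error is typically tiny, which is exactly the "reconstruction from seemingly incomplete information" the abstract refers to.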
Sampling numbers of smoothness classes via ℓ1-minimization
Using techniques developed recently in the field of compressed sensing, we prove new
upper bounds for general (nonlinear) sampling numbers of (quasi-) Banach smoothness …
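For orientation, the (nonlinear) sampling numbers bounded in this work are, in one common convention (the paper's exact normalization may differ),

g_m(F)_{L_2} = \inf_{x_1,\dots,x_m} \; \inf_{R : \mathbb{C}^m \to L_2} \; \sup_{f \in F} \bigl\| f - R\bigl(f(x_1),\dots,f(x_m)\bigr) \bigr\|_{L_2},

i.e. the best worst-case L_2 error achievable over the class F from m function samples, with an arbitrary (possibly nonlinear) reconstruction map R.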
Polyharmonic homogenization, rough polyharmonic splines and sparse super-localization
We introduce a new variational method for the numerical homogenization of divergence
form elliptic, parabolic and hyperbolic equations with arbitrary rough (L∞) coefficients. Our …
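The prototypical problem in this setting is the divergence-form elliptic equation (standard formulation, not quoted from the paper):

-\nabla \cdot \bigl( a(x) \nabla u(x) \bigr) = f(x) \ \text{in } \Omega, \qquad u = 0 \ \text{on } \partial\Omega,

where the coefficient a \in L^\infty(\Omega) is only assumed measurable, symmetric and uniformly elliptic, \lambda |\xi|^2 \le \xi^\top a(x) \xi \le \Lambda |\xi|^2. "Rough" refers to the absence of any smoothness or scale-separation assumptions on a.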
[HTML][HTML] The Gelfand widths of ℓp-balls for 0< p≤ 1
S Foucart, A Pajor, H Rauhut, T Ullrich - Journal of Complexity, 2010 - Elsevier
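For reference, the Gelfand width of a set K in a normed space X is usually defined as (standard definition; the paper's object is the unit ball of ℓp for 0 < p ≤ 1):

d^m(K, X) = \inf_{\operatorname{codim} L \le m} \; \sup_{x \in K \cap L} \|x\|_X,

the worst-case norm remaining after intersecting K with the best possible subspace of codimension at most m. These widths govern the best error achievable by any recovery scheme based on m linear measurements, which is why they are central to compressed sensing.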
Transformers are minimax optimal nonparametric in-context learners
In-context learning (ICL) of large language models has proven to be a surprisingly effective
method of learning a new task from only a few demonstrative examples. In this paper, we …
Hyperbolic cross approximation
V Temlyakov, T Ullrich - 2016 - Springer
This book is a survey on multivariate approximation. The 20th century was a period of
transition from univariate problems to multivariate problems in a number of areas of …
[HTML][HTML] Optimal approximation of multivariate periodic Sobolev functions in the sup-norm
F Cobos, T Kühn, W Sickel - Journal of Functional Analysis, 2016 - Elsevier
Using tools from the theory of operator ideals and s-numbers, we develop a general
approach to transfer estimates for L2-approximation of Sobolev functions into estimates for …
Flavors of compressive sensing
S Foucart - Approximation Theory XV: San Antonio 2016 15, 2017 - Springer
About a decade ago, a couple of groundbreaking articles revealed the possibility of faithfully
recovering high-dimensional signals from some seemingly incomplete information about …