The curse of overparametrization in adversarial training: Precise analysis of robust generalization for random features regression

H Hassani, A Javanmard - The Annals of Statistics, 2024 - projecteuclid.org
Precise asymptotic analysis of robust generalization for random features regression under adversarial training, characterizing when overparametrization hurts robustness.

Gardner formula for Ising perceptron models at small densities

E Bolthausen, S Nakajima, N Sun… - Conference on Learning …, 2022 - proceedings.mlr.press
We consider the Ising perceptron model with N spins and M = αN patterns, with a general activation function U that is bounded above. For U bounded away from zero, or U a …

Sharp threshold sequence and universality for Ising perceptron models

S Nakajima, N Sun - Proceedings of the 2023 Annual ACM-SIAM …, 2023 - SIAM
We study a family of Ising perceptron models with {0, 1}-valued activation functions. This
includes the classical half-space models, as well as some of the symmetric models …

Algorithmic pure states for the negative spherical perceptron

A El Alaoui, M Sellke - Journal of Statistical Physics, 2022 - Springer
We consider the spherical perceptron with Gaussian disorder. This is the set S of points σ ∈ ℝ^N on the sphere of radius √N satisfying ⟨g_a, σ⟩ ≥ κ√N for all 1 ≤ a ≤ M, where (g_a)_{a=1}^M …

Discrepancy algorithms for the binary perceptron

S Li, T Schramm, K Zhou - arXiv preprint arXiv:2408.00796, 2024 - arxiv.org
The binary perceptron problem asks us to find a sign vector in the intersection of independently chosen random halfspaces with intercept −κ. We analyze the …

Typical and atypical solutions in nonconvex neural networks with discrete and continuous weights

C Baldassi, EM Malatesta, G Perugini, R Zecchina - Physical Review E, 2023 - APS
We study the binary and continuous negative-margin perceptrons as simple nonconvex
neural network models learning random rules and associations. We analyze the geometry of …

Dynamical mean field theory for models of confluent tissues and beyond

PJ Kamali, P Urbani - SciPost Physics, 2023 - scipost.org
We consider a recently proposed model for understanding the rigidity transition in confluent tissues and derive the dynamical mean field theory (DMFT) equations that describe …

Gaussian universality of perceptrons with random labels

F Gerace, F Krzakala, B Loureiro, L Stephan… - Physical Review E, 2024 - APS
While classical in many theoretical settings, and in particular in statistical-physics-inspired works, the assumption of Gaussian i.i.d. input data is often perceived as a strong limitation in …

Injectivity of ReLU networks: perspectives from statistical physics

A Maillard, AS Bandeira, D Belius, I Dokmanić… - Applied and …, 2025 - Elsevier
When can the input of a ReLU neural network be inferred from its output? In other words, when is the network injective? We consider a single layer, x ↦ ReLU(Wx), with a random …

No free prune: Information-theoretic barriers to pruning at initialization

T Kumar, K Luo, M Sellke - arXiv preprint arXiv:2402.01089, 2024 - arxiv.org
The existence of "lottery tickets" (arXiv:1803.03635) at or near initialization raises the tantalizing question of whether large models are necessary in deep learning, or whether …