What are higher-order networks?

C Bick, E Gross, HA Harrington, MT Schaub - SIAM review, 2023 - SIAM
Network-based modeling of complex systems and data using the language of graphs has
become an essential topic across a range of different disciplines. Arguably, this graph-based …

Disordered systems insights on computational hardness

D Gamarnik, C Moore… - Journal of Statistical …, 2022 - iopscience.iop.org
In this review article we discuss connections between the physics of disordered systems,
phase transitions in inference problems, and computational hardness. We introduce two …

High-dimensional limit theorems for sgd: Effective dynamics and critical scaling

G Ben Arous, R Gheissari… - Advances in neural …, 2022 - proceedings.neurips.cc
We study the scaling limits of stochastic gradient descent (SGD) with constant step-size in
the high-dimensional regime. We prove limit theorems for the trajectories of summary …

Sampling with flows, diffusion, and autoregressive neural networks from a spin-glass perspective

D Ghio, Y Dandi, F Krzakala, L Zdeborová - Proceedings of the National …, 2024 - pnas.org
Recent years witnessed the development of powerful generative models based on flows,
diffusion, or autoregressive neural networks, achieving remarkable success in generating …

Tensor SVD: Statistical and computational limits

A Zhang, D Xia - IEEE Transactions on Information Theory, 2018 - ieeexplore.ieee.org
In this paper, we propose a general framework for tensor singular value decomposition
(SVD), which focuses on the methodology and theory …

Notes on computational hardness of hypothesis testing: Predictions using the low-degree likelihood ratio

D Kunisky, AS Wein, AS Bandeira - ISAAC Congress (International Society …, 2019 - Springer
These notes survey and explore an emerging method, which we call the low-degree
method, for understanding statistical-versus-computational tradeoffs in high-dimensional …

Online stochastic gradient descent on non-convex losses from high-dimensional inference

G Ben Arous, R Gheissari, A Jagannath - Journal of Machine Learning …, 2021 - jmlr.org
Stochastic gradient descent (SGD) is a popular algorithm for optimization problems arising
in high-dimensional inference tasks. Here one produces an estimator of an unknown …

Bayes-optimal learning of an extensive-width neural network from quadratically many samples

A Maillard, E Troiani, S Martin… - Advances in …, 2025 - proceedings.neurips.cc
We consider the problem of learning a target function corresponding to a single-hidden-layer
neural network, with a quadratic activation function after the first layer, and random weights …

Reducibility and statistical-computational gaps from secret leakage

M Brennan, G Bresler - Conference on Learning Theory, 2020 - proceedings.mlr.press
Inference problems with conjectured statistical-computational gaps are ubiquitous
throughout modern statistics, computer science, statistical physics and discrete probability …

The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference

J Barbier, N Macris - Probability theory and related fields, 2019 - Springer
In recent years important progress has been achieved towards proving the validity of the
replica predictions for the (asymptotic) mutual information (or “free energy”) in Bayesian …