CLOOB: Modern Hopfield networks with InfoLOOB outperform CLIP

A Fürst, E Rumetshofer, J Lehner… - Advances in neural …, 2022 - proceedings.neurips.cc
CLIP yielded impressive results on zero-shot transfer learning tasks and is considered a
foundation model like BERT or GPT-3. CLIP vision models that have a rich representation are …
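For context on this entry: the InfoLOOB ("leave one out bound") objective that CLOOB pairs with modern Hopfield networks is a contrastive loss whose denominator excludes the matched pair. A minimal NumPy sketch of that denominator change (function name and temperature value are illustrative, not from the paper):

```python
import numpy as np

def info_loob_loss(img_emb, txt_emb, tau=0.07):
    """Contrastive loss with the InfoLOOB denominator: unlike InfoNCE,
    the matched (positive) pair is excluded from the denominator, so the
    objective does not saturate for already well-aligned pairs."""
    # cosine similarities between L2-normalized embeddings, scaled by temperature
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    sims = img @ txt.T / tau
    n = sims.shape[0]
    pos = np.diag(sims)
    # zero out the diagonal so the positive pair is left out of the denominator
    off = np.exp(sims) * (1.0 - np.eye(n))
    return float(np.mean(-pos + np.log(off.sum(axis=1))))
```

In practice the loss is applied symmetrically (image-to-text and text-to-image); the sketch shows one direction.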

Risk-averse heteroscedastic Bayesian optimization

A Makarova, I Usmanova… - Advances in Neural …, 2021 - proceedings.neurips.cc
Many black-box optimization tasks arising in high-stakes applications require risk-averse
decisions. The standard Bayesian optimization (BO) paradigm, however, optimizes the …

Correlated noise provably beats independent noise for differentially private learning

CA Choquette-Choo, K Dvijotham, K Pillutla… - arXiv preprint arXiv …, 2023 - arxiv.org
Differentially private learning algorithms inject noise into the learning process. While the
most common private learning algorithm, DP-SGD, adds independent Gaussian noise in …
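As background for this entry: the DP-SGD baseline the abstract refers to clips each per-example gradient and adds independent Gaussian noise to the average. A minimal sketch of one such step (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.0, lr=0.1, rng=np.random.default_rng(0)):
    """One DP-SGD step: clip each example's gradient to clip_norm in L2,
    average, then add independent Gaussian noise calibrated to clip_norm."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)
```

The paper's point of departure is the noise term: it studies replacing the i.i.d. draws across iterations with correlated noise.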

Faster differentially private convex optimization via second-order methods

A Ganesh, M Haghifam, T Steinke… - Advances in Neural …, 2023 - proceedings.neurips.cc
Differentially private (stochastic) gradient descent is the workhorse of private machine
learning in both the convex and non-convex settings. Without privacy constraints, second …

PAC-Bayes-Chernoff bounds for unbounded losses

I Casado, LA Ortega, AR Masegosa, A Pérez - arXiv preprint arXiv …, 2024 - arxiv.org
We introduce a new PAC-Bayes oracle bound for unbounded losses. This result can be
understood as a PAC-Bayesian version of the Cramér-Chernoff bound. The proof technique …

A stochastic subspace approach to gradient-free optimization in high dimensions

D Kozak, S Becker, A Doostan, L Tenorio - … Optimization and Applications, 2021 - Springer
We present a stochastic descent algorithm for unconstrained optimization that is particularly
efficient when the objective function is slow to evaluate and gradients are not easily …
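For context on this entry: the abstract describes a gradient-free method suited to expensive objectives. A minimal sketch in the spirit of such stochastic subspace methods (sampling a random low-dimensional subspace per iteration and estimating directional derivatives by finite differences; all names and constants below are illustrative, not the paper's algorithm):

```python
import numpy as np

def stochastic_subspace_descent(f, x0, step=0.1, ell=2, h=1e-6,
                                iters=200, rng=np.random.default_rng(0)):
    """Gradient-free descent: each iteration samples a random ell-dimensional
    subspace, estimates directional derivatives of f by forward differences,
    and steps along the projected negative gradient. Only ell + 1 function
    evaluations per iteration, which helps when f is slow to evaluate."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        # random subspace with orthonormal columns (QR of a Gaussian matrix)
        P, _ = np.linalg.qr(rng.standard_normal((n, ell)))
        fx = f(x)
        d = np.array([(f(x + h * P[:, j]) - fx) / h for j in range(ell)])
        x = x - step * P @ d
    return x
```

On a smooth objective, the projected step P @ d approximates the gradient restricted to the sampled subspace, so progress accumulates across iterations without ever forming the full gradient.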

PAC-Bayes-Chernoff bounds for unbounded losses

I Casado Telletxea, LA Ortega Andrés… - Advances in …, 2025 - proceedings.neurips.cc
We introduce a new PAC-Bayes oracle bound for unbounded losses that extends Cramér-
Chernoff bounds to the PAC-Bayesian setting. The proof technique relies on controlling the …

Conditional mean estimation in Gaussian noise: A meta derivative identity with applications

A Dytso, HV Poor, SS Shitz - IEEE Transactions on Information …, 2022 - ieeexplore.ieee.org
Consider a channel Y = X + N, where X is an n-dimensional random vector and N is a
multivariate Gaussian vector with a full-rank covariance matrix. The object under consideration in this paper is the …

Strictly subgaussian probability distributions

SG Bobkov, GP Chistyakov, F Götze - Electronic Journal of …, 2024 - projecteuclid.org
We explore probability distributions on the real line whose Laplace transform admits an
upper bound of subgaussian type known as strict subgaussianity. One class in this family …
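For context, the definition behind this entry can be stated as follows (a standard formulation, not quoted from the paper): a centered random variable $X$ is strictly subgaussian when its true variance serves as the subgaussian variance proxy,

```latex
\mathbb{E}\, e^{tX} \le e^{\sigma^2 t^2 / 2}
\quad \text{for all } t \in \mathbb{R},
\qquad \sigma^2 = \operatorname{Var}(X).
```

This is stronger than ordinary subgaussianity, which only requires the bound to hold for some finite proxy $\sigma^2 \ge \operatorname{Var}(X)$.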

A general derivative identity for the conditional mean estimator in Gaussian noise and some applications

A Dytso, HV Poor, SS Shitz - 2020 IEEE International …, 2020 - ieeexplore.ieee.org
This paper provides a general derivative identity for the conditional mean estimator of an
arbitrary vector signal in Gaussian noise with an arbitrary covariance matrix. This new …
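As context for this entry, a commonly cited scalar special case of such derivative identities (stated here from general knowledge, not quoted from the paper): for $Y = X + N$ with $N \sim \mathcal{N}(0, \sigma^2)$ independent of $X$,

```latex
\frac{d}{dy}\, \mathbb{E}[X \mid Y = y] \;=\; \frac{\operatorname{Var}(X \mid Y = y)}{\sigma^2},
```

with the vector analogue replacing $1/\sigma^2$ by the inverse of the noise covariance matrix.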