Random features for kernel approximation: A survey on algorithms, theory, and beyond

F Liu, X Huang, Y Chen… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
The class of random features is one of the most popular techniques to speed up kernel
methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
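
For orientation, here is a minimal sketch of the classical random Fourier feature construction that this survey covers (Rahimi–Recht style): sample Gaussian directions and random phases so that inner products of the features approximate a Gaussian kernel. Function names and parameter values are illustrative, not taken from the paper.

import numpy as np

def rff_features(X, num_features=512, sigma=1.0, seed=0):
    """Random Fourier features whose inner products approximate the
    Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / sigma, size=(X.shape[1], num_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)                # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Sanity check: the feature inner products approach the exact Gaussian kernel.
X = np.random.default_rng(1).normal(size=(5, 10))
Z = rff_features(X, num_features=4096, sigma=1.5)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / (2 * 1.5 ** 2))
print(np.abs(Z @ Z.T - K_exact).max())  # small approximation error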

One-pass distribution sketch for measuring data heterogeneity in federated learning

Z Liu, Z Xu, B Coleman… - Advances in Neural …, 2023 - proceedings.neurips.cc
Federated learning (FL) is a machine learning paradigm where multiple client devices train
models collaboratively without data exchange. The data heterogeneity problem is naturally …

Smooth flipping probability for differential private sign random projection methods

P Li, X Li - Advances in Neural Information Processing …, 2023 - proceedings.neurips.cc
We develop a series of differential privacy (DP) algorithms from a family of random
projection (RP) and sign random projection (SignRP) methods. We first show how to …
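
As background for the entry above, this is a plain (non-private) sign random projection sketch: project onto Gaussian directions and keep only the signs, whose disagreement rate estimates the angle between two vectors. The DP mechanisms studied in the paper add calibrated sign flipping on top of this and are not reproduced here.

import numpy as np

def sign_rp(X, num_bits=4096, seed=0):
    """{-1, +1} sign random projections of the rows of X."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], num_bits))  # Gaussian projection directions
    return np.sign(X @ W)

def estimated_angle(bits_x, bits_y):
    """Pr[sign(w^T x) != sign(w^T y)] = angle(x, y) / pi, so the empirical
    disagreement rate times pi estimates the angle between x and y."""
    return np.mean(bits_x != bits_y) * np.pi

rng = np.random.default_rng(1)
x, y = rng.normal(size=100), rng.normal(size=100)
bits = sign_rp(np.stack([x, y]))
true_angle = np.arccos(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))
print(estimated_angle(bits[0], bits[1]), true_angle)  # close for large num_bits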

Learning a Fourier transform for linear relative positional encodings in transformers

K Choromanski, S Li, V Likhosherstov… - International …, 2024 - proceedings.mlr.press
We propose a new class of linear Transformers called FourierLearner-Transformers (FLTs),
which incorporate a wide range of relative positional encoding mechanisms (RPEs). These …

Low-precision arithmetic for fast Gaussian processes

WJ Maddox, A Potapcynski… - Uncertainty in Artificial …, 2022 - proceedings.mlr.press
Low precision arithmetic has had a transformative effect on the training of neural networks,
reducing computation, memory and energy requirements. However, despite their promise …

Binding in hippocampal-entorhinal circuits enables compositionality in cognitive maps

C Kymn, S Mazelet, A Thomas… - Advances in …, 2025 - proceedings.neurips.cc
We propose a normative model for spatial representation in the hippocampal formation that
combines optimality principles, such as maximizing coding range and spatial information per …

One-sketch-for-all: Non-linear random features from compressed linear measurements

X Li, P Li - … Conference on Artificial Intelligence and Statistics, 2021 - proceedings.mlr.press
The commonly used Gaussian kernel has a tuning parameter $\gamma $. This makes the
design of quantization schemes for random Fourier features (RFF) challenging, which is a …
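
For context, a standard identity (not taken from the paper) shows why the tuning parameter $\gamma$ complicates quantization: it enters through the sampling distribution of the projection directions, so features drawn for one $\gamma$ do not directly transfer to another:

$k_\gamma(x,y)=\exp\left(-\gamma\,\|x-y\|^2\right)=\mathbb{E}_{w\sim\mathcal{N}(0,\,2\gamma I)}\left[\cos\left(w^{\sf T}(x-y)\right)\right]$,

which random Fourier features estimate by averaging $\cos(w^{\sf T}x+b)\cos(w^{\sf T}y+b)$ over sampled $(w,b)$ pairs (up to a factor of 2).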

Random matrices in service of ML footprint: ternary random features with no performance loss

HT Ali, Z Liao, R Couillet - arXiv preprint arXiv:2110.01899, 2021 - arxiv.org
In this article, we investigate the spectral behavior of random features kernel matrices of the
type ${\bf K}=\mathbb{E}_{\bf w}\left[\sigma\left({\bf w}^{\sf T}{\bf x}_i\right)\sigma\left({\bf …
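
The expectation above is straightforward to estimate numerically; the sketch below (an illustration only, with ReLU as $\sigma$ and an arbitrary ternary law, not the paper's analysis) compares the kernel matrix obtained from Gaussian versus variance-matched ternary projections $w$.

import numpy as np

def mc_kernel(X, sample_w, sigma=lambda t: np.maximum(t, 0.0), num_w=100_000, seed=0):
    """Monte Carlo estimate of K_ij = E_w[sigma(w^T x_i) sigma(w^T x_j)]."""
    rng = np.random.default_rng(seed)
    W = sample_w(rng, (X.shape[1], num_w))  # columns are i.i.d. draws of w
    S = sigma(X @ W)                        # sigma(w^T x_i) for every i and w
    return S @ S.T / num_w                  # empirical expectation over w

gaussian_w = lambda rng, shape: rng.normal(size=shape)
# Ternary w in {-1, 0, +1}, rescaled so each entry has unit variance like the Gaussian case.
ternary_w = lambda rng, shape: rng.choice(
    [-1.0, 0.0, 1.0], p=[0.25, 0.5, 0.25], size=shape
) / np.sqrt(0.5)

X = np.random.default_rng(1).normal(size=(4, 50)) / np.sqrt(50)  # normalized data
print(mc_kernel(X, gaussian_w).round(3))
print(mc_kernel(X, ternary_w).round(3))  # nearly the same K, echoing the "no performance loss" message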

SignRFF: Sign random Fourier features

X Li, P Li - Advances in Neural Information Processing …, 2022 - proceedings.neurips.cc
The industry practice has been moving to embedding-based retrieval (EBR). For example, in
many applications, the embedding vectors are trained by some form of two-tower models …
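
A minimal sketch of the idea behind sign random Fourier features for retrieval, assuming the basic construction (signs of cosine random features compared by Hamming distance); the estimator and analysis in the paper itself are not reproduced, and the data and parameters below are illustrative.

import numpy as np

def sign_rff(X, num_bits=1024, sigma=1.0, seed=0):
    """Sign bits of random Fourier features for the rows of X."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / sigma, size=(X.shape[1], num_bits))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_bits)
    return np.sign(np.cos(X @ W + b))

rng = np.random.default_rng(2)
query = rng.normal(size=32)
database = rng.normal(size=(100, 32))
database[7] = query + 0.05 * rng.normal(size=32)  # plant a near-duplicate of the query

bits_q = sign_rff(query[None, :])[0]
bits_db = sign_rff(database)                      # same seed, so the same (W, b) are reused
hamming = (bits_q != bits_db).mean(axis=1)        # smaller distance = more similar
print(hamming.argmin())                           # expected to retrieve index 7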

Retaining knowledge for learning with dynamic definition

Z Liu, B Coleman, T Zhang… - Advances in Neural …, 2022 - proceedings.neurips.cc
Machine learning models are often deployed in settings where they must be
constantly updated in response to the changes in class definitions while retaining high …