PD-Sparse: A primal and dual sparse approach to extreme multiclass and multilabel classification

IEH Yen, X Huang, P Ravikumar… - … on machine learning, 2016 - proceedings.mlr.press
Abstract We consider Multiclass and Multilabel classification with an extremely large number of
classes, of which only a few are labeled for each instance. In such a setting, standard methods …

PPDsparse: A parallel primal-dual sparse method for extreme classification

IEH Yen, X Huang, W Dai, P Ravikumar… - Proceedings of the 23rd …, 2017 - dl.acm.org
Extreme Classification comprises multi-class or multi-label prediction where there is a large
number of classes, and is increasingly relevant to many real-world applications such as text …

Random warping series: A random features method for time-series embedding

L Wu, IEH Yen, J Yi, F Xu, Q Lei… - International …, 2018 - proceedings.mlr.press
Time series data analytics has been a problem of substantial interest for decades, and
Dynamic Time Warping (DTW) has been the most widely adopted technique to measure …
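The DTW measure referenced in this entry can be sketched as a classic dynamic program. This is a minimal illustration of plain DTW between two 1-D series, not the Random Warping Series embedding itself; the function name is hypothetical.

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic-programming DTW between two 1-D series,
    using absolute difference as the local cost."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Extend the cheapest of the three admissible alignments.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Unlike the Euclidean distance, DTW can align series of different lengths: `dtw_distance([0, 0, 1], [0, 1])` is 0 because the repeated leading sample is absorbed by the warping path.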

Generalization bounds for sparse random feature expansions

A Hashemi, H Schaeffer, R Shi, U Topcu, G Tran… - Applied and …, 2023 - Elsevier
Random feature methods have been successful in various machine learning tasks, are easy
to compute, and come with theoretical accuracy bounds. They serve as an alternative …

Conditioning of random feature matrices: Double descent and generalization error

Z Chen, H Schaeffer - arxiv preprint arxiv:2110.11477, 2021 - arxiv.org
We provide (high probability) bounds on the condition number of random feature matrices. In
particular, we show that if the complexity ratio $\frac{N}{m}$ where $N$ is the number of …

CROification: Accurate kernel classification with the efficiency of sparse linear SVM

M Kafai, K Eshghi - IEEE transactions on pattern analysis and …, 2017 - ieeexplore.ieee.org
Kernel methods have been shown to be effective for many machine learning tasks such as
classification and regression. In particular, support vector machines with the Gaussian …

Shrimp: Sparser random feature models via iterative magnitude pruning

Y Xie, R Shi, H Schaeffer… - … and Scientific Machine …, 2022 - proceedings.mlr.press
Sparse shrunk additive models and sparse random feature models have been developed
separately as methods to learn low-order functions, where there are few interactions …

Linearized GMM kernels and normalized random Fourier features

P Li - Proceedings of the 23rd ACM SIGKDD international …, 2017 - dl.acm.org
The method of "random Fourier features (RFF)" has become a popular tool for approximating
the "radial basis function (RBF)" kernel. The variance of RFF is actually large. Interestingly …
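The RFF approximation this entry builds on admits a short sketch. The following is a minimal illustration of the standard random Fourier feature construction for the RBF kernel (not the paper's normalized variant); the function name and parameters are assumptions for the example.

```python
import numpy as np

def rff_features(X, D=512, gamma=1.0, seed=0):
    """Map rows of X (n, d) to D random Fourier features whose inner
    products approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies are drawn from the kernel's spectral density; for this
    # parameterization that is N(0, 2*gamma*I).
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    # z(x) = sqrt(2/D) * cos(x W + b), so that E[z(x) . z(y)] = k(x, y).
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

The approximation error shrinks as `D` grows (the estimator's variance is O(1/D)), which is exactly the large-variance issue the snippet alludes to for moderate `D`.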

Low-precision random Fourier features for memory-constrained kernel approximation

J Zhang, A May, T Dao, C Ré - The 22nd International …, 2019 - proceedings.mlr.press
We investigate how to train kernel approximation methods that generalize well under a
memory budget. Building on recent theoretical work, we define a measure of kernel …

Scalable spectral clustering using random binning features

L Wu, PY Chen, IEH Yen, F Xu, Y Xia… - Proceedings of the 24th …, 2018 - dl.acm.org
Spectral clustering is one of the most effective clustering approaches that capture hidden
cluster structures in the data. However, it does not scale well to large-scale problems due to …