A review of distributed statistical inference

Y Gao, W Liu, H Wang, X Wang, Y Yan… - Statistical Theory and …, 2022 - Taylor & Francis
The rapid emergence of massive datasets in various fields poses a serious challenge to
traditional statistical methods. Meanwhile, it provides opportunities for researchers to …

Distributed estimation of principal eigenspaces

J Fan, D Wang, K Wang, Z Zhu - Annals of Statistics, 2019 - pmc.ncbi.nlm.nih.gov
Principal component analysis (PCA) is fundamental to statistical machine learning. It extracts
latent principal factors that account for most of the variation in the data. When data are stored …
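A minimal sketch of one common aggregation scheme for distributed PCA (function names are illustrative; the paper's exact estimator and its analysis may differ): each machine computes the top-k eigenvectors of its local sample covariance, the local projection matrices are averaged, and the top-k eigenspace of the average is returned.

```python
import numpy as np

def local_eigenspace(X, k):
    """Top-k eigenvectors of the local sample covariance matrix."""
    cov = X.T @ X / X.shape[0]
    # eigh returns eigenvalues in ascending order; take the last k columns
    _, vecs = np.linalg.eigh(cov)
    return vecs[:, -k:]

def distributed_pca(splits, k):
    """Average the local projection matrices V V^T, then extract
    the top-k eigenspace of the averaged projection."""
    d = splits[0].shape[1]
    avg_proj = np.zeros((d, d))
    for X in splits:
        V = local_eigenspace(X, k)
        avg_proj += V @ V.T
    avg_proj /= len(splits)
    _, vecs = np.linalg.eigh(avg_proj)
    return vecs[:, -k:]
```

Averaging the projection matrices V V^T rather than the eigenvector matrices themselves sidesteps the sign and rotation ambiguity of individual eigenvectors.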

Sobolev norm learning rates for regularized least-squares algorithms

S Fischer, I Steinwart - Journal of Machine Learning Research, 2020 - jmlr.org
Learning rates for least-squares regression are typically expressed in terms of L2-norms. In
this paper we extend these rates to norms stronger than the L2-norm without requiring the …

Distributed semi-supervised learning with kernel ridge regression

X Chang, SB Lin, DX Zhou - Journal of Machine Learning Research, 2017 - jmlr.org
This paper provides error analysis for distributed semi-supervised learning with kernel ridge
regression (DSKRR) based on a divide-and-conquer strategy. DSKRR applies kernel ridge …
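A minimal sketch of the divide-and-conquer strategy underlying such methods (names and parameters are illustrative; the paper's semi-supervised variant additionally exploits unlabeled data): each machine solves a local kernel ridge regression problem, and the global estimator averages the local predictions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def krr_fit(X, y, lam=1e-2, sigma=1.0):
    """Solve the local KRR system (K + n*lam*I) alpha = y on one machine."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return X, alpha, sigma

def dkrr_predict(models, X_new):
    """Divide-and-conquer estimate: average the local KRR predictions."""
    preds = [gaussian_kernel(X_new, X, sigma) @ alpha
             for (X, alpha, sigma) in models]
    return np.mean(preds, axis=0)
```

The averaging step is where the theory does its work: each local estimator sees only a fraction of the data, and the error analysis quantifies how many machines the data can be split across before the averaged estimator loses the optimal learning rate.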

On the optimality of misspecified spectral algorithms

H Zhang, Y Li, Q Lin - Journal of Machine Learning Research, 2024 - jmlr.org
In the misspecified spectral algorithms problem, researchers usually assume the
underlying true function $f_{\rho}^{*} \in [\mathcal{H}]^{s}$, a less-smooth interpolation …

Distributed kernel-based gradient descent algorithms

SB Lin, DX Zhou - Constructive Approximation, 2018 - Springer
We study the generalization ability of distributed learning equipped with a divide-and-
conquer approach and gradient descent algorithm in a reproducing kernel Hilbert space …

Distributed kernel ridge regression with communications

SB Lin, D Wang, DX Zhou - Journal of Machine Learning Research, 2020 - jmlr.org
This paper focuses on generalization performance analysis for distributed algorithms in the
framework of learning theory. Taking distributed kernel ridge regression (DKRR) for …

Statistical optimality of divide and conquer kernel-based functional linear regression

J Liu, L Shi - Journal of Machine Learning Research, 2024 - jmlr.org
Previous analysis of regularized functional linear regression in a reproducing kernel Hilbert
space (RKHS) typically requires the target function to be contained in this kernel space. This …

Generalization performance of multi-pass stochastic gradient descent with convex loss functions

Y Lei, T Hu, K Tang - Journal of Machine Learning Research, 2021 - jmlr.org
Stochastic gradient descent (SGD) has become the method of choice to tackle large-scale
datasets due to its low computational cost and good practical performance. Learning rate …
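A minimal sketch of multi-pass SGD for a convex loss, here least squares (the function name and the square-root step-size decay are illustrative assumptions, not the paper's exact schedule):

```python
import numpy as np

def multipass_sgd(X, y, passes=5, eta0=0.1):
    """Multi-pass SGD on the least-squares loss 0.5*(w.x - y)^2,
    with decaying step size eta_t = eta0 / sqrt(t)."""
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    rng = np.random.default_rng(0)
    for _ in range(passes):
        for i in rng.permutation(n):  # one pass = one shuffled epoch
            t += 1
            eta = eta0 / np.sqrt(t)
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of the single-sample loss
            w -= eta * grad
    return w
```

Each point is visited once per pass, so the total iteration count (and hence the implicit regularization) is controlled jointly by the number of passes and the step-size decay, which is exactly the trade-off this line of work analyzes.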

Optimal convergence for distributed learning with stochastic gradient methods and spectral algorithms

J Lin, V Cevher - Journal of Machine Learning Research, 2020 - jmlr.org
We study generalization properties of distributed algorithms in the setting of nonparametric
regression over a reproducing kernel Hilbert space (RKHS). We first investigate distributed …