Sufficient dimension reduction: Methods and applications with R

B Li - 2018 - taylorfrancis.com
Sufficient dimension reduction is a rapidly developing research field that has wide
applications in regression diagnostics, data visualization, machine learning, genomics …
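
The book's starting point can be stated compactly. The display below is the standard formulation of sufficient dimension reduction (notation mine, not quoted from the book): the goal is a low-dimensional projection of the predictors that loses no regression information.

```latex
% Seek B in R^{p x d}, d << p, such that Y depends on X only through B^T X:
\[
  Y \;\perp\!\!\!\perp\; X \;\bigm|\; B^{\top}X ,
  \qquad
  \mathcal{S}_{Y\mid X} \;=\; \bigcap\bigl\{\operatorname{span}(B) \,:\, Y \perp\!\!\!\perp X \mid B^{\top}X \bigr\}.
\]
% The intersection S_{Y|X} (the central subspace) exists under mild conditions
% and is the target that SIR, SAVE, and related estimators aim to recover.
```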

Dimension reduction for high-dimensional data

L Li - Statistical methods in molecular biology, 2010 - Springer
With the advance of modern technologies, high-dimensional data have become prevalent in
computational biology. The number of variables p is very large, and in many applications, p …

Kernel Partial Correlation Coefficient – a Measure of Conditional Dependence

Z Huang, N Deb, B Sen - Journal of Machine Learning Research, 2022 - jmlr.org
We propose and study a class of simple, nonparametric, yet interpretable measures of
conditional dependence, which we call the kernel partial correlation (KPC) coefficient, between …

High dimensional single index models

P Radchenko - Journal of Multivariate Analysis, 2015 - Elsevier
This paper addresses the problem of fitting nonlinear regression models in high-
dimensional situations, where the number of predictors, p, is large relative to the number of …
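
For orientation, the single-index model studied here posits that the regression function depends on the predictors only through one linear combination; a standard statement (my notation, not the paper's) is below.

```latex
% Single-index model with unknown link f; beta is identified only up to scale,
% so a normalization such as ||beta||_2 = 1 is imposed. In the high-dimensional
% regime p >> n, beta is additionally taken to be sparse and is estimated with
% a penalty on its entries.
\[
  \mathbb{E}\,[\,Y \mid X\,] \;=\; f\!\left(\beta^{\top}X\right),
  \qquad \lVert \beta \rVert_{2} = 1 .
\]
```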

Sparse SIR: Optimal rates and adaptive estimation

K Tan, L Shi, Z Yu - 2020 - projecteuclid.org
The Annals of Statistics, 2020, Vol. 48, No. 1, 64–85. https://doi.org/10.1214/18-AOS1791
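
For readers new to SIR itself, the unpenalized estimator that sparse SIR regularizes is short to write down. Below is a minimal NumPy sketch of classical sliced inverse regression on synthetic data; the function and variable names are mine, and this is the plain estimator, not the sparse, rate-optimal procedure of the paper.

```python
import numpy as np

def sir(X, y, n_slices=10, d=1):
    """Classical sliced inverse regression: estimate d directions spanning
    the central subspace (no sparsity penalty)."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice the sorted response into roughly equal-count slices
    slices = np.array_split(np.argsort(y), n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    directions = inv_sqrt @ v[:, ::-1][:, :d]
    return directions / np.linalg.norm(directions, axis=0)

# Synthetic check: y depends on X only through beta^T X
rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:2] = [1.0, -1.0]
y = np.exp(0.5 * X @ beta) + 0.1 * rng.standard_normal(n)
print(sir(X, y, d=1).ravel().round(2))  # roughly proportional to beta, up to sign
```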

A concise overview of principal support vector machines and its generalization

J Shin, SJ Shin - Communications for Statistical Applications and …, 2024 - koreascience.kr
In high-dimensional data analysis, sufficient dimension reduction (SDR) has been
considered an attractive tool for reducing the dimensionality of predictors while …
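
The abstract does not spell out the mechanics, so here is a rough sketch of the flavor of principal-SVM-based SDR, not the authors' exact formulation: dichotomize the response at several cutpoints, fit a linear SVM to each binary problem on standardized predictors, and extract principal directions from the stacked normal vectors. Off-the-shelf scikit-learn SVMs stand in for the paper's specific objective.

```python
import numpy as np
from sklearn.svm import LinearSVC

def psvm_directions(X, y, n_cuts=5, d=1, C=1.0):
    """Sketch of principal-SVM-style dimension reduction: the leading
    principal components of the SVM normal vectors, back-transformed to the
    original predictor scale, estimate central-subspace directions."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    normals = []
    # One linear SVM per dichotomization threshold of the response
    for c in np.quantile(y, np.linspace(0.2, 0.8, n_cuts)):
        svm = LinearSVC(C=C, max_iter=20000).fit(Z, (y > c).astype(int))
        normals.append(svm.coef_.ravel())
    # Right singular vectors of the stacked normals = principal directions in Z scale
    _, _, vt = np.linalg.svd(np.array(normals), full_matrices=False)
    directions = inv_sqrt @ vt[:d].T
    return directions / np.linalg.norm(directions, axis=0)
```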

On post dimension reduction statistical inference

K Kim, B Li, Z Yu, L Li - 2020 - projecteuclid.org
The Annals of Statistics, 2020, Vol. 48, No. 3, 1567–1592. https://doi.org/10.1214/19-AOS1859

On marginal sliced inverse regression for ultrahigh dimensional model-free feature selection

Z Yu, Y Dong, J Shao - 2016 - projecteuclid.org
Model-free variable selection has been implemented under the sufficient dimension
reduction framework since the seminal paper of Cook [Ann. Statist. 32 (2004) 1062–1092]. In …
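
One way to make the marginal idea concrete: compute a SIR-type utility for each predictor separately and screen on it. The sketch below uses the between-slice variance of each standardized predictor as that utility; it conveys the spirit of marginal SIR screening but omits the thresholding rules and theory developed in the paper, and all names are mine.

```python
import numpy as np

def marginal_sir_utility(X, y, n_slices=10):
    """For each predictor X_j, estimate Var(E[X_j | Y]) by the between-slice
    variance of its standardized values; large values flag predictors that are
    marginally informative about y in a model-free sense."""
    n, p = X.shape
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    slices = np.array_split(np.argsort(y), n_slices)
    util = np.zeros(p)
    for idx in slices:
        util += (len(idx) / n) * Z[idx].mean(axis=0) ** 2
    return util

# Screening: keep the k predictors with the largest utilities
# keep = np.argsort(marginal_sir_utility(X, y))[::-1][:k]
```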

Asymptotic properties of sufficient dimension reduction with a diverging number of predictors

Y Wu, L Li - Statistica Sinica, 2011 - pmc.ncbi.nlm.nih.gov
We investigate asymptotic properties of a family of sufficient dimension reduction estimators
when the number of predictors p diverges to infinity with the sample size. We adopt a …

Non-convex penalized estimation in high-dimensional models with single-index structure

T Wang, PR Xu, LX Zhu - Journal of Multivariate Analysis, 2012 - Elsevier
As promising alternatives to the LASSO, non-convex penalized methods, such as the SCAD
and the minimax concave penalty method, produce asymptotically unbiased shrinkage …
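
For concreteness, the two penalties named in the abstract have standard closed forms; as usually stated, with a > 2 and gamma > 1 as tuning constants and a = 3.7 a common default for SCAD:

```latex
% SCAD penalty (Fan and Li, 2001), defined through its derivative for t >= 0:
\[
  p'_{\lambda}(t) \;=\; \lambda\Bigl\{\mathbf{1}(t \le \lambda)
    + \frac{(a\lambda - t)_{+}}{(a-1)\lambda}\,\mathbf{1}(t > \lambda)\Bigr\},
  \qquad a > 2 .
\]
% Minimax concave penalty, MCP (Zhang, 2010):
\[
  p_{\lambda}(t) \;=\;
  \begin{cases}
    \lambda t - \dfrac{t^{2}}{2\gamma}, & 0 \le t \le \gamma\lambda,\\[4pt]
    \dfrac{\gamma\lambda^{2}}{2}, & t > \gamma\lambda,
  \end{cases}
  \qquad \gamma > 1 .
\]
% Both act like the lasso penalty near zero but flatten out for large |t|,
% which is what removes the asymptotic shrinkage bias the abstract refers to.
```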