Efficient Aggregated Kernel Tests using Incomplete U-statistics

A Schrab, I Kim, B Guedj… - Advances in Neural …, 2022 - proceedings.neurips.cc
We propose a series of computationally efficient, nonparametric tests for the two-sample,
independence and goodness-of-fit problems, using the Maximum Mean Discrepancy …

MMD aggregated two-sample test

A Schrab, I Kim, M Albert, B Laurent, B Guedj… - Journal of Machine …, 2023 - jmlr.org
We propose two novel nonparametric two-sample kernel tests based on the Maximum Mean
Discrepancy (MMD). First, for a fixed kernel, we construct an MMD test using either …
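The entry above concerns MMD-based two-sample testing with a fixed kernel. As a minimal illustrative sketch (not the paper's adaptive procedure), the following implements a biased squared-MMD estimate with a Gaussian kernel and a permutation test; the bandwidth, permutation count, and level are arbitrary example choices:

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth):
    # Pairwise squared distances, then the Gaussian (RBF) kernel matrix.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * bandwidth**2))

def mmd2_biased(X, Y, bandwidth):
    # Biased (V-statistic) estimate of squared MMD: distance between
    # empirical kernel mean embeddings of the two samples.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

def mmd_permutation_test(X, Y, bandwidth=1.0, n_perms=200, alpha=0.05, seed=0):
    # Approximate the null by recomputing the statistic on random
    # re-splits of the pooled sample.
    rng = np.random.default_rng(seed)
    stat = mmd2_biased(X, Y, bandwidth)
    Z = np.vstack([X, Y])
    n = len(X)
    null_stats = []
    for _ in range(n_perms):
        idx = rng.permutation(len(Z))
        null_stats.append(mmd2_biased(Z[idx[:n]], Z[idx[n:]], bandwidth))
    p_value = (1 + np.sum(np.array(null_stats) >= stat)) / (1 + n_perms)
    return stat, p_value, p_value <= alpha
```

The permutation calibration gives a non-asymptotically valid level, which is one reason permutation-based MMD tests are common baselines; the aggregated tests in the papers above go further by combining many bandwidths.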

MMD-FUSE: Learning and combining kernels for two-sample testing without data splitting

F Biggs, A Schrab, A Gretton - Advances in Neural …, 2024 - proceedings.neurips.cc
We propose novel statistics which maximise the power of a two-sample test based on the
Maximum Mean Discrepancy (MMD), by adapting over the set of kernels used in defining it …

KSD aggregated goodness-of-fit test

A Schrab, B Guedj, A Gretton - Advances in Neural …, 2022 - proceedings.neurips.cc
We investigate properties of goodness-of-fit tests based on the Kernel Stein Discrepancy
(KSD). We introduce a strategy to construct a test, called KSDAgg, which aggregates …
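The entry above is about goodness-of-fit testing with the Kernel Stein Discrepancy. As a hedged sketch of the underlying statistic (a plain V-statistic with a single Gaussian base kernel, not the paper's KSDAgg aggregation over kernels), the following computes a squared-KSD estimate given the model's score function, using the standard Langevin Stein kernel identities for the Gaussian kernel:

```python
import numpy as np

def ksd_vstat(X, score, sigma=1.0):
    # V-statistic estimate of squared KSD with Gaussian base kernel
    # k(x,y) = exp(-||x-y||^2 / (2 sigma^2)); `score` is grad log p
    # of the model, so only the unnormalised density is needed.
    n, d = X.shape
    S = score(X)                                   # (n, d) model score at samples
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    K = np.exp(-d2 / (2 * sigma**2))
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d) pairwise x_i - x_j
    # Stein kernel: s(x)^T s(y) k + s(x)^T grad_y k + s(y)^T grad_x k
    #               + trace(grad_x grad_y k)
    term1 = (S @ S.T) * K
    term2 = np.einsum('id,ijd->ij', S, diff) / sigma**2 * K    # grad_y k = (x-y)/s^2 k
    term3 = -np.einsum('jd,ijd->ij', S, diff) / sigma**2 * K   # grad_x k = -(x-y)/s^2 k
    term4 = (d / sigma**2 - d2 / sigma**4) * K
    return (term1 + term2 + term3 + term4).mean()
```

For a standard normal model the score is simply `lambda X: -X`; samples drawn far from the model yield a much larger statistic than samples from the model itself.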

Informative features for model comparison

W Jitkrittum, H Kanagawa, P Sangkloy… - Advances in neural …, 2018 - proceedings.neurips.cc
Given two candidate models, and a set of target observations, we address the problem of
measuring the relative goodness of fit of the two models. We propose two new statistical …

Effective nonlinear feature selection method based on HSIC Lasso and with variational inference

K Koyama, K Kiritoshi, T Okawachi… - International …, 2022 - proceedings.mlr.press
HSIC Lasso is one of the most effective sparse nonlinear feature selection methods based
on the Hilbert-Schmidt independence criterion. We propose an adaptive nonlinear feature …
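HSIC Lasso itself solves a lasso-regularised problem over kernelised features; as a simpler illustration of the criterion it builds on, the sketch below scores each feature by its empirical (biased) HSIC with the target and ranks them. The Gaussian-kernel bandwidths and the ranking-only selection are illustrative assumptions, not the paper's method:

```python
import numpy as np

def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC between two 1-D variables:
    # trace(Kx H Ky H) / (n-1)^2 with centring matrix H = I - 11^T/n.
    n = len(x)
    Kx = np.exp(-(x[:, None] - x[None, :])**2 / (2 * sigma_x**2))
    Ky = np.exp(-(y[:, None] - y[None, :])**2 / (2 * sigma_y**2))
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(Kx @ H @ Ky @ H) / (n - 1)**2

def rank_features_by_hsic(X, y):
    # Score every feature column by its HSIC with the target, then
    # return indices sorted from most to least dependent.
    scores = np.array([hsic(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores
```

Because HSIC with characteristic kernels detects arbitrary (including nonlinear) dependence, a purely quadratic relationship, invisible to correlation, still puts the relevant feature at the top of the ranking.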

Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation

T Tsukurimichi, Y Inatsu, VNL Duy… - Annals of the Institute of …, 2022 - Springer
In this paper, we consider conditional selective inference (SI) for a linear model estimated
after outliers are removed from the data. To apply the conditional SI framework, it is …

Kernel sufficient dimension reduction and variable selection for compositional data via amalgamation

J Park, J Ahn, C Park - International Conference on Machine …, 2023 - proceedings.mlr.press
Compositional data with a large number of components and an abundance of zeros have
recently become common in many fields. Analyzing such sparse high-dimensional …

Valid and exact statistical inference for multi-dimensional multiple change-points by selective inference

R Sugiyama, H Toda, VNL Duy, Y Inatsu… - arXiv preprint arXiv …, 2021 - arxiv.org
In this paper, we study statistical inference of change-points (CPs) in a multi-dimensional
sequence. In CP detection from a multi-dimensional sequence, it is often desirable not only …

Learning kernel tests without data splitting

J Kübler, W Jitkrittum, B Schölkopf… - Advances in Neural …, 2020 - proceedings.neurips.cc
Modern large-scale kernel-based tests such as maximum mean discrepancy (MMD) and
kernelized Stein discrepancy (KSD) optimize kernel hyperparameters on a held-out sample …