Efficient Aggregated Kernel Tests using Incomplete U-statistics
We propose a series of computationally efficient, nonparametric tests for the two-sample,
independence and goodness-of-fit problems, using the Maximum Mean Discrepancy …
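As rough context for the entry above: an incomplete U-statistic trades a little variance for speed by averaging the kernel-test summand over a random subset of index pairs instead of all O(n^2) of them. The sketch below is illustrative only, not the paper's construction; it assumes 1-D data, a Gaussian kernel, and hypothetical names (`incomplete_mmd2`, `n_pairs`).

```python
import numpy as np

def incomplete_mmd2(x, y, n_pairs=1000, bandwidth=1.0, seed=0):
    # Incomplete U-statistic estimate of squared MMD: average the two-sample
    # summand h over a random design of index pairs rather than all pairs.
    rng = np.random.default_rng(seed)
    n = len(x)
    k = lambda a, b: np.exp(-(a - b) ** 2 / (2 * bandwidth ** 2))
    i = rng.integers(0, n, n_pairs)
    j = rng.integers(0, n, n_pairs)
    keep = i != j                      # U-statistics exclude the diagonal
    i, j = i[keep], j[keep]
    h = k(x[i], x[j]) + k(y[i], y[j]) - k(x[i], y[j]) - k(x[j], y[i])
    return h.mean()
```

The cost is O(n_pairs) kernel evaluations regardless of n, which is the computational point of the incomplete-U-statistic approach.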
MMD aggregated two-sample test
We propose two novel nonparametric two-sample kernel tests based on the Maximum Mean
Discrepancy (MMD). First, for a fixed kernel, we construct an MMD test using either …
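To illustrate the general shape of an MMD two-sample test (not the specific construction of this entry), here is a minimal NumPy sketch that calibrates a fixed-kernel MMD statistic with a permutation threshold; the names, 1-D data, and default parameters are all assumptions for the example.

```python
import numpy as np

def rbf_mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of squared MMD with a Gaussian kernel.
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def mmd_permutation_test(x, y, n_perms=200, level=0.05, bandwidth=1.0, seed=0):
    # Reject H0: P = Q when the observed statistic exceeds the (1 - level)
    # quantile of its permutation null, built by shuffling the pooled sample.
    rng = np.random.default_rng(seed)
    observed = rbf_mmd2(x, y, bandwidth)
    pooled = np.concatenate([x, y])
    null = []
    for _ in range(n_perms):
        perm = rng.permutation(pooled)
        null.append(rbf_mmd2(perm[:len(x)], perm[len(x):], bandwidth))
    return observed > np.quantile(null, 1 - level)
```

Permutation calibration gives a non-asymptotically valid level for any fixed kernel, which is why it is the standard baseline these aggregated tests build on.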
MMD-FUSE: Learning and combining kernels for two-sample testing without data splitting
We propose novel statistics which maximise the power of a two-sample test based on the
Maximum Mean Discrepancy (MMD), by adapting over the set of kernels used in defining it …
KSD aggregated goodness-of-fit test
We investigate properties of goodness-of-fit tests based on the Kernel Stein Discrepancy
(KSD). We introduce a strategy to construct a test, called KSDAgg, which aggregates …
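For context on the KSD entry above: unlike MMD, the KSD measures fit to a target density through its score function, so no samples from the target are needed. The sketch below is not KSDAgg; it is a minimal V-statistic KSD estimate for a standard normal target (score s(x) = -x) with a Gaussian kernel in 1-D, where the Stein kernel has the closed form h(a, b) = k(a, b) * (ab - (a-b)^2/s^2 + 1/s^2 - (a-b)^2/s^4).

```python
import numpy as np

def ksd2_std_normal(x, bandwidth=1.0):
    # V-statistic estimate of the squared KSD between sample x and a
    # standard normal target N(0, 1), using a Gaussian kernel.
    s2 = bandwidth ** 2
    d = x[:, None] - x[None, :]
    k = np.exp(-d ** 2 / (2 * s2))
    h = k * (x[:, None] * x[None, :] - d ** 2 / s2 + 1.0 / s2 - d ** 2 / s2 ** 2)
    return h.mean()
```

The estimate is non-negative by construction and shrinks toward zero as the sample better matches the target; choosing the bandwidth well is exactly the problem aggregation strategies like the one above address.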
Informative features for model comparison
Given two candidate models, and a set of target observations, we address the problem of
measuring the relative goodness of fit of the two models. We propose two new statistical …
Effective nonlinear feature selection method based on HSIC Lasso and with variational inference
K Koyama, K Kiritoshi, T Okawachi… - International …, 2022 - proceedings.mlr.press
HSIC Lasso is one of the most effective sparse nonlinear feature selection methods based
on the Hilbert-Schmidt independence criterion. We propose an adaptive nonlinear feature …
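For background on the criterion this entry builds on: HSIC measures dependence between two variables via centered kernel matrices. This is only a sketch of the plain biased HSIC estimate (not HSIC Lasso or the paper's variational method), assuming 1-D variables, Gaussian kernels, and an illustrative name `hsic_biased`.

```python
import numpy as np

def hsic_biased(x, y, bandwidth=1.0):
    # Biased empirical HSIC with Gaussian kernels:
    # HSIC = trace(K H L H) / n^2, where H = I - 11^T/n centers the Gram matrices.
    n = len(x)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * bandwidth ** 2))
    L = np.exp(-(y[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2
```

HSIC Lasso then selects features by regressing the response's kernel matrix on the features' kernel matrices under an L1 penalty, so each candidate feature's contribution is scored through terms like the one computed here.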
Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
In this paper, we consider conditional selective inference (SI) for a linear model estimated
after outliers are removed from the data. To apply the conditional SI framework, it is …
Kernel sufficient dimension reduction and variable selection for compositional data via amalgamation
Compositional data with a large number of components and an abundance of zeros have recently become common in many fields. Analyzing such sparse high-dimensional …
Valid and exact statistical inference for multi-dimensional multiple change-points by selective inference
In this paper, we study statistical inference of change-points (CPs) in multi-dimensional sequences. In CP detection from a multi-dimensional sequence, it is often desirable not only …
Learning kernel tests without data splitting
Modern large-scale kernel-based tests such as maximum mean discrepancy (MMD) and
kernelized Stein discrepancy (KSD) optimize kernel hyperparameters on a held-out sample …