Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation

M Sugiyama, T Suzuki, T Kanamori - Annals of the Institute of Statistical …, 2012 - Springer
Estimation of the ratio of probability densities has attracted a great deal of attention since it
can be used for addressing various statistical paradigms. A naive approach to density-ratio …
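
As an illustration of the framework's best-known special case, here is a minimal sketch of unconstrained least-squares importance fitting (uLSIF), i.e. density-ratio matching under the squared (Pearson) Bregman divergence. The Gaussian basis functions, the choice of centres, and the regularization constant `lam` are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def ulsif_ratio(x_nu, x_de, sigma=1.0, lam=0.1, n_centers=100):
    """Sketch of uLSIF: model the ratio r(x) = p_nu(x)/p_de(x) as a linear
    combination of Gaussian kernels and fit the weights by regularized least
    squares. Inputs are arrays of shape (n_samples, n_features)."""
    centers = x_nu[:min(n_centers, len(x_nu))]        # kernel centres (assumption)

    def phi(x):                                       # Gaussian design matrix
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    H = phi(x_de).T @ phi(x_de) / len(x_de)           # empirical E_de[phi phi^T]
    h = phi(x_nu).mean(axis=0)                        # empirical E_nu[phi]
    theta = np.linalg.solve(H + lam * np.eye(len(h)), h)
    return lambda x: np.maximum(phi(x) @ theta, 0.0)  # clip negative ratios to zero
```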

Optimal kernel choice for large-scale two-sample tests

A Gretton, D Sejdinovic, H Strathmann… - Advances in neural …, 2012 - proceedings.neurips.cc
Given samples from distributions $p$ and $q$, a two-sample test determines whether to
reject the null hypothesis that $p = q$, based on the value of a test statistic measuring the …
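
For reference, the test statistic in question is typically the unbiased, quadratic-time empirical MMD$^2$. The sketch below uses a fixed Gaussian bandwidth, whereas the paper's point is precisely how to choose the kernel so as to maximize test power.

```python
import numpy as np

def mmd2_unbiased(X, Y, sigma=1.0):
    """Unbiased quadratic-time estimate of MMD^2 between samples X ~ p and Y ~ q,
    using a Gaussian kernel k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    m, n = len(X), len(Y)
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    # Drop diagonal terms so the within-sample averages are unbiased.
    term_xx = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_yy = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * Kxy.mean()
```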

Change-point detection in time-series data by relative density-ratio estimation

S Liu, M Yamada, N Collier, M Sugiyama - Neural Networks, 2013 - Elsevier
The objective of change-point detection is to discover abrupt property changes lying behind
time-series data. In this paper, we present a novel statistical change-point detection …
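
A hedged sketch of the retrospective windowing scheme this line of work uses: build length-$k$ subsequences, then score each time point by a dissimilarity between the subsequences just before and just after it. The mean-shift dissimilarity below is only a stand-in for the relative density-ratio divergence actually used in the paper.

```python
import numpy as np

def change_scores(y, k=5, n=50, dissim=None):
    """Sliding-window change-point scores for a 1-D series y. At each time t,
    compare the n subsequences of length k ending before t with the n
    subsequences starting at t, using a dissimilarity between the two sets."""
    if dissim is None:
        # Illustrative stand-in: squared distance between window means,
        # normalised by pooled variance (NOT the paper's divergence).
        def dissim(A, B):
            pooled = np.concatenate([A, B]).var() + 1e-12
            return ((A.mean(0) - B.mean(0)) ** 2).sum() / pooled

    subs = np.lib.stride_tricks.sliding_window_view(y, k)   # (len(y)-k+1, k)
    scores = np.full(len(y), np.nan)
    for t in range(n, len(subs) - n):
        scores[t] = dissim(subs[t - n:t], subs[t:t + n])
    return scores
```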

[BOOK][B] Machine learning in non-stationary environments: Introduction to covariate shift adaptation

M Sugiyama, M Kawanabe - 2012 - books.google.com
Theory, algorithms, and applications of machine learning techniques to overcome "covariate
shift" non-stationarity. As the power of computing has grown over the past few decades, the …

Generative models and model criticism via optimized maximum mean discrepancy

DJ Sutherland, HY Tung, H Strathmann, S De… - arXiv preprint arXiv …, 2016 - arxiv.org
We propose a method to optimize the representation and distinguishability of samples from
two probability distributions, by maximizing the estimated power of a statistical test based on …
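
The power-maximization criterion shared by this paper and the optimal-kernel-choice work above can be summarized as follows (notation ours; estimator and regularization details differ between the two papers): for a fixed test level, asymptotic power is increasing in the ratio of the MMD to the standard deviation of its estimator under the alternative, so the kernel is chosen as

$$\hat{k} = \arg\max_{k \in \mathcal{K}} \; \frac{\widehat{\mathrm{MMD}}^2_k(X, Y)}{\hat{\sigma}_k(X, Y)},$$

where $\hat{\sigma}_k$ estimates the standard deviation of $\widehat{\mathrm{MMD}}^2_k$ under $p \neq q$.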

Guiding new physics searches with unsupervised learning

A De Simone, T Jacques - The European Physical Journal C, 2019 - Springer
We propose a new scientific application of unsupervised learning techniques to boost our
ability to search for new phenomena in data, by detecting discrepancies between two …

Maximum mean discrepancy test is aware of adversarial attacks

R Gao, F Liu, J Zhang, B Han, T Liu… - International …, 2021 - proceedings.mlr.press
The maximum mean discrepancy (MMD) test could in principle detect any distributional
discrepancy between two datasets. However, it has been shown that the MMD test is …

Machine learning with squared-loss mutual information

M Sugiyama - Entropy, 2012 - mdpi.com
Mutual information (MI) is useful for detecting statistical independence between random
variables, and it has been successfully applied to solving various machine learning …
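
For reference (notation ours), squared-loss mutual information is the Pearson ($\chi^2$) divergence from the product of marginals to the joint distribution,

$$\mathrm{SMI}(X, Y) = \frac{1}{2} \iint p_x(x)\, p_y(y) \left( \frac{p_{xy}(x, y)}{p_x(x)\, p_y(y)} - 1 \right)^{2} \mathrm{d}x \,\mathrm{d}y,$$

which vanishes if and only if $X$ and $Y$ are independent, and which can be estimated by fitting the density ratio $p_{xy}/(p_x p_y)$ directly by least squares.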

Relative density-ratio estimation for robust distribution comparison

M Yamada, T Suzuki, T Kanamori, H Hachiya… - Neural …, 2013 - ieeexplore.ieee.org
Divergence estimators based on direct approximation of density ratios without going through
separate approximation of numerator and denominator densities have been successfully …
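
The "relative" ratio replaces $p/q$ with a ratio against a mixture, which is bounded above by $1/\alpha$ and therefore better behaved for estimation:

$$r_\alpha(x) = \frac{p(x)}{\alpha\, p(x) + (1 - \alpha)\, q(x)}, \qquad 0 \le \alpha < 1,$$

and the two distributions are compared via the $\alpha$-relative Pearson divergence $\mathrm{PE}_\alpha = \frac{1}{2} \int q_\alpha(x)\, (r_\alpha(x) - 1)^2 \,\mathrm{d}x$ with $q_\alpha = \alpha p + (1 - \alpha) q$ (notation follows the standard presentation and may differ from the paper's in minor details).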

Detecting abnormal situations using the Kullback–Leibler divergence

J Zeng, U Kruger, J Geluk, X Wang, L Xie - Automatica, 2014 - Elsevier
This article develops statistics based on the Kullback–Leibler (KL) divergence to monitor
large-scale technical systems. These statistics detect anomalous system behavior by …
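
A minimal monitoring sketch under a multivariate-Gaussian working model (the closed-form KL divergence is standard; the sliding window, reference batch, and fixed threshold are illustrative assumptions, not the article's exact statistics):

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) for d-dimensional Gaussians."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def kl_monitor(X_ref, X_stream, window=200, threshold=1.0):
    """Flag time indices where the KL divergence between the reference model
    and a sliding window of recent observations exceeds a threshold."""
    mu0, S0 = X_ref.mean(0), np.cov(X_ref, rowvar=False)
    alarms = []
    for t in range(window, len(X_stream) + 1):
        W = X_stream[t - window:t]
        mu1, S1 = W.mean(0), np.cov(W, rowvar=False)
        if kl_gaussian(mu0, S0, mu1, S1) > threshold:
            alarms.append(t)
    return alarms
```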