Kernel mean embedding of distributions: A review and beyond
A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has
recently emerged as a powerful tool for machine learning and statistical inference. The basic …
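The maximum mean discrepancy (MMD), the distance between the kernel mean embeddings of two distributions, has a simple unbiased sample estimator. A minimal NumPy sketch, assuming a Gaussian kernel with a fixed bandwidth (the function name and bandwidth choice are illustrative, not from the paper):

```python
import numpy as np

def mmd2_unbiased(X, Y, bandwidth=1.0):
    """Unbiased estimate of the squared MMD between samples X and Y
    under a Gaussian kernel; near zero when both samples come from
    the same distribution."""
    def gram(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * bandwidth ** 2))

    n, m = len(X), len(Y)
    Kxx, Kyy, Kxy = gram(X, X), gram(Y, Y), gram(X, Y)
    np.fill_diagonal(Kxx, 0.0)  # drop i == j terms for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (n * (n - 1))
            + Kyy.sum() / (m * (m - 1))
            - 2.0 * Kxy.mean())
```

In a two-sample test the statistic would be compared against a permutation null; the sketch only shows the estimator itself.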
Examples are not enough, learn to criticize! Criticism for interpretability
Example-based explanations are widely used in the effort to improve the interpretability of
highly complex distributions. However, prototypes alone are rarely sufficient to represent the …
Near-linear time approximation algorithms for optimal transport via Sinkhorn iteration
Computing optimal transport distances such as the earth mover's distance is a fundamental
problem in machine learning, statistics, and computer vision. Despite the recent introduction …
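Sinkhorn iteration approximates entropy-regularized optimal transport by alternately rescaling the rows and columns of a Gibbs kernel until the plan matches the prescribed marginals. A hedged NumPy sketch (the regularization strength and iteration count are illustrative defaults, not values from the paper):

```python
import numpy as np

def sinkhorn_plan(cost, r, c, reg=0.1, n_iter=500):
    """Approximate entropy-regularized OT plan between marginals r and c.

    cost: (n, m) ground-cost matrix; r, c: probability vectors.
    Alternating scaling drives the plan's row and column sums
    toward r and c."""
    K = np.exp(-cost / reg)              # Gibbs kernel
    u = np.ones_like(r)
    for _ in range(n_iter):
        v = c / (K.T @ u)                # fix column marginals
        u = r / (K @ v)                  # fix row marginals
    return u[:, None] * K * v[None, :]   # plan = diag(u) K diag(v)
```

The regularized transport cost is then `(plan * cost).sum()`; smaller `reg` approximates the exact earth mover's distance more tightly at the price of more iterations.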
Revisiting classifier two-sample tests
The goal of two-sample tests is to assess whether two samples, $S_P \sim P^n$ and
$S_Q \sim Q^m$, are drawn from the same distribution. Perhaps intriguingly, one relatively …
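The classifier two-sample test idea can be sketched with a 1-nearest-neighbour classifier in plain NumPy (the paper studies trained classifiers such as neural networks; the 1-NN choice and names here are illustrative assumptions): label the pooled samples by origin, hold out half, and use held-out accuracy as the test statistic, with accuracy near 0.5 consistent with $P = Q$.

```python
import numpy as np

def c2st_accuracy(S_P, S_Q, seed=0):
    """Classifier two-sample test statistic: held-out accuracy of a
    1-nearest-neighbour classifier trained to tell S_P from S_Q."""
    rng = np.random.default_rng(seed)
    X = np.vstack([S_P, S_Q])
    y = np.concatenate([np.zeros(len(S_P)), np.ones(len(S_Q))])
    order = rng.permutation(len(X))        # shuffle before splitting
    X, y = X[order], y[order]
    half = len(X) // 2
    Xtr, ytr = X[:half], y[:half]
    Xte, yte = X[half:], y[half:]
    # 1-NN: each held-out point takes its nearest training point's label
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    pred = ytr[d2.argmin(axis=1)]
    return (pred == yte).mean()
```

A p-value would come from the binomial null distribution of the accuracy under $P = Q$; the sketch stops at the statistic.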
A kernel test of goodness of fit
K Chwialkowski, H Strathmann… - … conference on machine …, 2016 - proceedings.mlr.press
We propose a nonparametric statistical test for goodness-of-fit: given a set of samples, the
test determines how likely it is that these were generated from a target density function. The …
Making tree ensembles interpretable: A Bayesian model selection approach
Tree ensembles, such as random forests, are renowned for their high prediction
performance. However, their interpretability is critically limited due to the enormous …
Generative models and model criticism via optimized maximum mean discrepancy
We propose a method to optimize the representation and distinguishability of samples from
two probability distributions, by maximizing the estimated power of a statistical test based on …
One-network adversarial fairness
There is currently a great expansion of the impact of machine learning algorithms on our
lives, prompting the need for objectives other than pure performance, including fairness …
Fast two-sample testing with analytic representations of probability measures
KP Chwialkowski, A Ramdas… - Advances in Neural …, 2015 - proceedings.neurips.cc
We propose a class of nonparametric two-sample tests with a cost linear in the sample size.
Two tests are given, both based on an ensemble of distances between analytic functions …
Interpretable distribution features with maximum testing power
Two semimetrics on probability distributions are proposed, given as the sum of differences of
expectations of analytic functions evaluated at spatial or frequency locations (i.e., features) …
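The core quantity can be sketched as the difference of empirical kernel mean embeddings evaluated at a handful of spatial test locations (the bandwidth, function names, and hand-picked locations below are illustrative assumptions, not the paper's power-optimized procedure):

```python
import numpy as np

def mean_embedding_diff(X, Y, locs, bandwidth=1.0):
    """Difference of empirical Gaussian-kernel mean embeddings of X
    and Y at a few test locations ('features').

    Entries far from zero point to regions of the input space where
    the two distributions differ, which is what makes the features
    interpretable."""
    def embed(A):
        sq = ((A[:, None, :] - locs[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * bandwidth ** 2)).mean(axis=0)
    return embed(X) - embed(Y)
```

The paper normalizes such differences into a test statistic and chooses the locations to maximize test power; the sketch only exposes the raw feature values.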