Inherent tradeoffs in learning fair representations
Real-world applications of machine learning tools in high-stakes domains are often
regulated to be fair, in the sense that the predicted target should satisfy some quantitative …
On the empirical estimation of integral probability metrics
Given two probability measures P and Q defined on a measurable space S, the integral
probability metric (IPM) is defined as F(P, Q) = \sup\{\, |\int_S f\, dP - \int_S f\, dQ| \,:\, f ∈ …
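To make the definition concrete, here is a minimal sketch (not the estimator studied in the paper) of a plug-in estimate for one particular IPM: the total variation distance, which up to a factor of 1/2 is the IPM whose function class is the unit ball of bounded functions, and for binned samples reduces to half the L1 distance between empirical probability vectors.

```python
import numpy as np

def empirical_tv(x, y, bins=20):
    """Plug-in estimate of total variation distance between two samples.

    Discretizes both samples onto a shared histogram and returns half
    the L1 distance between the resulting empirical probability vectors.
    """
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    return 0.5 * np.abs(p - q).sum()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)
y = rng.normal(1.0, 1.0, size=5000)
tv = empirical_tv(x, y)
```

The bin count and sample sizes here are illustrative; the bias/variance of such plug-in estimates as a function of discretization is exactly the kind of question the paper addresses.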
On integral probability metrics, φ-divergences and binary classification
A class of distance measures on probabilities--the integral probability metrics (IPMs)--is
addressed: these include the Wasserstein distance, Dudley metric, and Maximum Mean …
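Of the IPMs listed, the Maximum Mean Discrepancy is the easiest to estimate from samples. A minimal sketch using a Gaussian kernel and the simple biased (V-statistic) form — the kernel and bandwidth are illustrative choices, not taken from the paper:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Gram matrix of the Gaussian (RBF) kernel between rows of a and b."""
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-sq / (2 * sigma**2))

def mmd_biased(x, y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples x, y."""
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=(500, 1))
y = rng.normal(2.0, 1.0, size=(500, 1))
```

The squared MMD is zero when both samples come from the same distribution and grows as the distributions separate, which makes it a popular two-sample test statistic.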
On the chi square and higher-order chi distances for approximating f-divergences
We report closed-form formulas for calculating the chi-square and higher-order chi distances
between statistical distributions belonging to the same exponential family with affine natural …
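The closed forms reported in the paper are specific to exponential families; as a generic illustration only (not the paper's formula), the Pearson chi-square divergence χ²(P‖Q) = Σᵢ (pᵢ − qᵢ)²/qᵢ can be computed directly for discrete distributions:

```python
import numpy as np

def chi_square_divergence(p, q):
    """Pearson chi-square divergence chi^2(P||Q) = sum_i (p_i - q_i)^2 / q_i,
    for discrete distributions with q_i > 0 everywhere.  It is the
    f-divergence generated by f(t) = (t - 1)^2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum((p - q) ** 2 / q)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
d = chi_square_divergence(p, q)
```

Note the asymmetry: χ²(P‖Q) generally differs from χ²(Q‖P), which is why the order of arguments matters.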
Information, Divergence and Risk for Binary Experiments
We unify f-divergences, Bregman divergences, surrogate regret bounds, proper scoring
rules, cost curves, ROC-curves and statistical information. We do this by systematically …
A probabilistic optimal sensor design approach for structural health monitoring using risk-weighted f-divergence
This paper presents a new approach to optimal sensor design for structural health
monitoring (SHM) applications using a modified f-divergence objective functional. One of the …
Optimal bounds between f-divergences and integral probability metrics
R Agrawal, T Horel - Journal of Machine Learning Research, 2021 - jmlr.org
The families of f-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability
Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to …
Clustering in Hilbert's projective geometry: The case studies of the probability simplex and the elliptope of correlation matrices
Clustering categorical distributions in the probability simplex is a fundamental task met in
many applications dealing with normalized histograms. Traditionally, differential-geometric …
On measures of entropy and information
GE Crooks - Tech. Note, 2017 - threeplusone.com
Broadly speaking, an information measure is any function of one or more probability
distributions. An entropy is an information measure that has units of entropy—negative …
Generalised Pinsker inequalities
We generalise the classical Pinsker inequality which relates variational divergence to
Kullback-Leibler divergence in two ways: we consider arbitrary f-divergences in place of KL …
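The classical inequality being generalised here states that TV(P, Q) ≤ √(KL(P‖Q)/2), with TV(P, Q) = (1/2) Σᵢ |pᵢ − qᵢ| and KL in nats. A quick numerical check on random discrete distributions (an illustrative sketch, not the paper's generalisation):

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P||Q) in nats for discrete distributions (q_i > 0 where p_i > 0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def total_variation(p, q):
    """TV(P, Q) = (1/2) sum_i |p_i - q_i|."""
    return 0.5 * np.abs(np.asarray(p, float) - np.asarray(q, float)).sum()

# Pinsker's inequality: TV <= sqrt(KL / 2), checked on random distributions.
rng = np.random.default_rng(2)
for _ in range(100):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    assert total_variation(p, q) <= np.sqrt(0.5 * kl_divergence(p, q)) + 1e-12
```

The inequality is one direction only: KL can be arbitrarily large (even infinite) while TV stays bounded by 1, which is precisely the regime where tighter f-divergence bounds become interesting.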