Inherent tradeoffs in learning fair representations

H Zhao, GJ Gordon - Journal of Machine Learning Research, 2022 - jmlr.org
Real-world applications of machine learning tools in high-stakes domains are often
regulated to be fair, in the sense that the predicted target should satisfy some quantitative …
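The snippet is truncated, but a common quantitative criterion in this line of work (and the one this paper's analysis centers on) is statistical parity; as a sketch in standard notation, with prediction Ŷ and protected attribute A (neither symbol taken from the snippet), it requires

    \Pr(\hat{Y} = 1 \mid A = 0) = \Pr(\hat{Y} = 1 \mid A = 1)

i.e. independence of the prediction from the protected attribute.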

On the empirical estimation of integral probability metrics

BK Sriperumbudur, K Fukumizu, A Gretton, B Schölkopf… - 2012 - projecteuclid.org
Given two probability measures, P and Q, defined on a measurable space S, the integral
probability metric (IPM) is defined as γ_F(P, Q) = sup { |∫_S f dP − ∫_S f dQ| : f ∈ …
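As a concrete illustration of plug-in empirical estimation for one member of this family, here is a minimal Python sketch of the (biased) maximum mean discrepancy estimator with a Gaussian kernel; the kernel choice, bandwidth, and function names are illustrative assumptions, not taken from the paper.

    import numpy as np

    def gaussian_kernel(x, y, bandwidth=1.0):
        # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)), computed pairwise.
        d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2))

    def mmd_biased(x, y, bandwidth=1.0):
        # Plug-in (biased) estimate of MMD(P, Q) from samples x ~ P, y ~ Q:
        # MMD^2 = E[k(X, X')] - 2 E[k(X, Y)] + E[k(Y, Y')].
        kxx = gaussian_kernel(x, x, bandwidth).mean()
        kyy = gaussian_kernel(y, y, bandwidth).mean()
        kxy = gaussian_kernel(x, y, bandwidth).mean()
        return float(np.sqrt(max(kxx - 2.0 * kxy + kyy, 0.0)))

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(200, 1))
    y = rng.normal(0.5, 1.0, size=(200, 1))
    print(mmd_biased(x, y))  # grows as P and Q move apart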

On integral probability metrics, φ-divergences and binary classification

BK Sriperumbudur, K Fukumizu, A Gretton… - arXiv preprint arXiv …, 2009 - arxiv.org
A class of distance measures on probabilities--the integral probability metrics (IPMs)--is
addressed: these include the Wasserstein distance, Dudley metric, and Maximum Mean …
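For reference, the well-known function classes F that generate the metrics named here (standard correspondences, stated as a sketch in the γ_F notation above):

    \mathcal{F} = \{ f : \|f\|_{\mathrm{Lip}} \le 1 \} \quad\Rightarrow\quad \text{Wasserstein-1 (Kantorovich) distance}
    \mathcal{F} = \{ f : \|f\|_\infty + \|f\|_{\mathrm{Lip}} \le 1 \} \quad\Rightarrow\quad \text{Dudley metric}
    \mathcal{F} = \{ f : \|f\|_{\mathcal{H}} \le 1,\ \mathcal{H}\ \text{an RKHS} \} \quad\Rightarrow\quad \text{maximum mean discrepancy}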

On the chi square and higher-order chi distances for approximating f-divergences

F Nielsen, R Nock - IEEE Signal Processing Letters, 2013 - ieeexplore.ieee.org
We report closed-form formulas for calculating the Chi square and higher-order Chi distances
between statistical distributions belonging to the same exponential family with affine natural …
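To convey the flavor of these closed forms: for densities p_θ(x) = exp(⟨t(x), θ⟩ − F(θ)) in an exponential family with log-normalizer F, the Pearson chi-square distance reduces to the following (a sketch, assuming 2θ₂ − θ₁ stays in the natural parameter space; conventions may differ from the paper's):

    \chi^2(p_{\theta_1} : p_{\theta_2}) = e^{F(2\theta_2 - \theta_1) - 2F(\theta_2) + F(\theta_1)} - 1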

[PDF] Information, Divergence and Risk for Binary Experiments.

MD Reid, RC Williamson - Journal of Machine Learning Research, 2011 - jmlr.org
We unify f-divergences, Bregman divergences, surrogate regret bounds, proper scoring
rules, cost curves, ROC-curves and statistical information. We do this by systematically …
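For orientation, the f-divergence at the center of this unification is, in the usual convention (f convex with f(1) = 0):

    D_f(P, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ

with f(t) = t log t recovering the KL divergence and f(t) = |t − 1| recovering the variational divergence.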

A probabilistic optimal sensor design approach for structural health monitoring using risk-weighted f-divergence

Y Yang, M Chadha, Z Hu, MA Vega, MD Parno… - … Systems and Signal …, 2021 - Elsevier
This paper presents a new approach to optimal sensor design for structural health
monitoring (SHM) applications using a modified f-divergence objective functional. One of the …

Optimal bounds between f-divergences and integral probability metrics

R Agrawal, T Horel - Journal of Machine Learning Research, 2021 - jmlr.org
The families of f-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability
Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to …
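The classical instance of such a bound is Pinsker's inequality: with TV(P, Q) = sup_A |P(A) − Q(A)|,

    \mathrm{TV}(P, Q) \le \sqrt{\tfrac{1}{2}\,\mathrm{KL}(P \,\|\, Q)}

No reverse bound holds in general, since KL(P ∥ Q) can be infinite while TV(P, Q) ≤ 1; pinning down the optimal inequalities of this kind is the subject of this paper.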

Clustering in Hilbert's projective geometry: The case studies of the probability simplex and the elliptope of correlation matrices

F Nielsen, K Sun - Geometric structures of information, 2019 - Springer
Clustering categorical distributions in the probability simplex is a fundamental task met in
many applications dealing with normalized histograms. Traditionally, differential-geometric …
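The Hilbert projective metric on the open probability simplex, which underlies this clustering approach, has a simple closed form (a standard formula, stated here as a sketch):

    d_{\mathrm{H}}(p, q) = \log \frac{\max_i \, p_i / q_i}{\min_j \, p_j / q_j}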

[PDF] On measures of entropy and information

GE Crooks - Tech. Note, 2017 - threeplusone.com
Broadly speaking, an information measure is any function of one or more probability
distributions. An entropy is an information measure that has units of entropy: negative …
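The canonical example meeting this definition is the Shannon entropy, in bits with log₂ or in nats with the natural logarithm:

    H(P) = -\sum_x p(x) \log p(x)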

Generalised Pinsker inequalities

MD Reid, RC Williamson - arXiv preprint arXiv:0906.1244, 2009 - arxiv.org
We generalise the classical Pinsker inequality which relates variational divergence to
Kullback-Leibler divergence in two ways: we consider arbitrary f-divergences in place of KL …
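A minimal Python sanity check of the classical inequality being generalised, using the variational divergence V(P, Q) = Σᵢ |pᵢ − qᵢ| and the bound V ≤ √(2 KL); the Dirichlet sampling and helper names below are illustrative only:

    import numpy as np

    def kl(p, q):
        # Kullback-Leibler divergence in nats; assumes q > 0 wherever p > 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def variational(p, q):
        # Variational (L1) divergence, i.e. twice the total variation distance.
        return float(np.abs(p - q).sum())

    rng = np.random.default_rng(0)
    for _ in range(1000):
        p = rng.dirichlet(np.ones(5))
        q = rng.dirichlet(np.ones(5))
        assert variational(p, q) <= np.sqrt(2.0 * kl(p, q)) + 1e-12
    print("V(P, Q) <= sqrt(2 KL(P || Q)) held on all 1000 sampled pairs")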