On the Jensen–Shannon symmetrization of distances relying on abstract means
F Nielsen - Entropy, 2019 - mdpi.com
The Jensen–Shannon divergence is a renowned bounded symmetrization of the
unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler …
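The snippet below is a minimal sketch (not code from the paper) of the quantity the abstract refers to: the Jensen–Shannon divergence of two discrete distributions, written as the average Kullback–Leibler divergence to their arithmetic mixture, which makes it symmetric and bounded by log 2.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions (nats)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """JSD(p, q) = (KL(p || m) + KL(q || m)) / 2 with m the arithmetic mixture."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * (kl(p, m) + kl(q, m))

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(jensen_shannon(p, q))  # symmetric, finite, and at most log(2)
```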
On integral probability metrics, φ-divergences and binary classification
BK Sriperumbudur, K Fukumizu, A Gretton… - arXiv preprint arXiv …, 2009 - arxiv.org
A class of distance measures on probabilities--the integral probability metrics (IPMs)--is
addressed: these include the Wasserstein distance, Dudley metric, and Maximum Mean …
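Below is a minimal sketch of one of the integral probability metrics named in the abstract, the Maximum Mean Discrepancy; the Gaussian kernel, its bandwidth, and the biased V-statistic estimator are my illustrative choices, not the paper's.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of MMD^2(P, Q) from samples X ~ P and Y ~ Q."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2(X, Y))  # grows as the two samples differ more
```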
The Burbea-Rao and Bhattacharyya centroids
F Nielsen, S Boltz - IEEE Transactions on Information Theory, 2011 - ieeexplore.ieee.org
We study the centroid with respect to the class of information-theoretic Burbea-Rao
divergences that generalize the celebrated Jensen-Shannon divergence by measuring the …
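A minimal sketch of a Burbea-Rao divergence as the Jensen convexity gap of a strictly convex generator F, BR_F(p, q) = (F(p) + F(q))/2 - F((p + q)/2); choosing the negative Shannon entropy as generator (which recovers the Jensen-Shannon divergence) is an illustrative assumption, not code from the paper.

```python
import numpy as np

def neg_entropy(p):
    """F(p) = sum_i p_i log p_i (negative Shannon entropy, a convex generator)."""
    p = np.asarray(p, float)
    return float(np.sum(np.where(p > 0, p * np.log(p), 0.0)))

def burbea_rao(p, q, F=neg_entropy):
    """Jensen convexity gap of F at the midpoint of p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * (F(p) + F(q)) - F(0.5 * (p + q))

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(burbea_rao(p, q))  # equals the Jensen-Shannon divergence for this generator
```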
Positive definite matrices and the S-divergence
S Sra - Proceedings of the American Mathematical Society, 2016 - ams.org
Hermitian positive definite (hpd) matrices form a self-dual convex cone whose interior is a
Riemannian manifold of nonpositive curvature. The manifold view comes with a natural …
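A minimal sketch of the S-divergence (symmetric Stein divergence) in its log-determinant form, S(X, Y) = log det((X + Y)/2) - (1/2)(log det X + log det Y); this is an illustration rather than the paper's code, and it restricts to real symmetric positive definite matrices instead of the general Hermitian case.

```python
import numpy as np

def s_divergence(X, Y):
    """Symmetric Stein / S-divergence between SPD matrices X and Y."""
    _, ld_mid = np.linalg.slogdet(0.5 * (X + Y))
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)); X = A @ A.T + np.eye(3)   # random SPD matrix
B = rng.normal(size=(3, 3)); Y = B @ B.T + np.eye(3)   # another random SPD matrix
print(s_divergence(X, Y))  # nonnegative, zero iff X == Y
```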
On a variational definition for the Jensen-Shannon symmetrization of distances based on the information radius
F Nielsen - Entropy, 2021 - mdpi.com
We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by
considering a variational definition with respect to a generic mean, thereby extending the …
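A minimal sketch of the variational (information-radius) definition in the arithmetic-mean case: the Jensen-Shannon divergence arises as the minimum, over a center c, of the average Kullback-Leibler divergence to c. The softmax parameterization of the simplex and the Nelder-Mead optimizer are my choices for the numerical check, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])

# Minimize the average KL divergence to a common center c on the simplex.
objective = lambda z: 0.5 * (kl(p, softmax(z)) + kl(q, softmax(z)))
res = minimize(objective, np.zeros(3), method="Nelder-Mead")

print(softmax(res.x))  # close to the arithmetic mixture (p + q) / 2
print(res.fun)         # close to the Jensen-Shannon divergence JSD(p, q)
```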
Revisiting Chernoff information with likelihood ratio exponential families
F Nielsen - Entropy, 2022 - mdpi.com
The Chernoff information between two probability measures is a statistical divergence
measuring their deviation defined as their maximally skewed Bhattacharyya distance …
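A minimal sketch of the Chernoff information for discrete distributions as the maximally skewed Bhattacharyya distance, C(p, q) = max over a in (0, 1) of -log sum_i p_i^a q_i^(1-a); the bounded one-dimensional optimizer is my choice, not the paper's method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def skew_bhattacharyya(p, q, a):
    """Skewed Bhattacharyya distance B_a(p, q) in nats."""
    return -np.log(np.sum(p**a * q**(1.0 - a)))

def chernoff_information(p, q):
    """Maximize the skewed Bhattacharyya distance over the skew parameter a."""
    res = minimize_scalar(lambda a: -skew_bhattacharyya(p, q, a),
                          bounds=(1e-6, 1 - 1e-6), method="bounded")
    return -res.fun, res.x  # (Chernoff information, optimal skew a*)

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(chernoff_information(p, q))
```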
On Voronoi diagrams on the information-geometric Cauchy manifolds
F Nielsen - Entropy, 2020 - mdpi.com
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual
complexes from the viewpoint of information geometry by considering the Fisher-Rao …
A Unified Learn-to-Distort-Data Framework for Privacy-Utility Trade-off in Trustworthy Federated Learning
In this paper, we first give an introduction to the theoretical basis of the privacy-utility
equilibrium in federated learning based on Bayesian privacy definitions and total variation …
Rho-Tau Bregman information and the geometry of annealing paths
R Brekelmans, F Nielsen - arXiv preprint arXiv:2209.07481, 2022 - arxiv.org
Markov Chain Monte Carlo methods for sampling from complex distributions and estimating
normalization constants often simulate samples from a sequence of intermediate …
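A minimal sketch of the standard geometric annealing path between two discrete distributions, the kind of intermediate sequence the abstract alludes to; the paper's rho-tau Bregman generalization itself is not implemented here.

```python
import numpy as np

def geometric_path(p0, p1, betas):
    """Normalized intermediates p_beta proportional to p0^(1-beta) * p1^beta."""
    path = []
    for b in betas:
        unnorm = p0**(1.0 - b) * p1**b
        path.append(unnorm / unnorm.sum())
    return np.array(path)

p0 = np.array([0.7, 0.2, 0.1])   # tractable base distribution
p1 = np.array([0.1, 0.3, 0.6])   # target distribution
print(geometric_path(p0, p1, betas=np.linspace(0.0, 1.0, 5)))
```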
Skew Jensen-Bregman Voronoi diagrams
A Jensen-Bregman divergence is a distortion measure defined by a Jensen convexity gap
induced by a strictly convex functional generator. Jensen-Bregman divergences unify the …
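A minimal sketch of a skew Jensen divergence as a skewed convexity gap of a strictly convex generator F, J_F^a(p, q) = a F(p) + (1 - a) F(q) - F(a p + (1 - a) q); the squared Euclidean generator, for which it reduces to a(1 - a)||p - q||^2, is an illustrative assumption rather than the paper's setting.

```python
import numpy as np

def skew_jensen(p, q, a, F):
    """Skewed Jensen convexity gap of the generator F between p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return a * F(p) + (1.0 - a) * F(q) - F(a * p + (1.0 - a) * q)

sq_norm = lambda x: float(np.dot(x, x))   # convex generator F(x) = ||x||^2
p = np.array([1.0, 2.0])
q = np.array([0.0, -1.0])
a = 0.3
print(skew_jensen(p, q, a, sq_norm))   # equals a * (1 - a) * ||p - q||^2
print(a * (1 - a) * sq_norm(p - q))    # same value, as a sanity check
```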