[BOOK] Information geometry and its applications

S Amari - 2016 - books.google.com
This is the first comprehensive book on information geometry, written by the founder of the
field. It begins with an elementary introduction to dualistic geometry and proceeds to a wide …
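
A central object of the dualistic geometry treated in the book is the Fisher information metric on a parametric family {p_theta}; its standard definition is recalled here for orientation (not quoted from the book):
\[
  g_{ij}(\theta) \;=\; \mathbb{E}_{p_\theta}\!\left[ \frac{\partial \log p_\theta(X)}{\partial \theta^i}\, \frac{\partial \log p_\theta(X)}{\partial \theta^j} \right].
\]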

Privacy–security trade-offs in biometric security systems—Part I: Single use case

L Lai, SW Ho, HV Poor - IEEE Transactions on Information …, 2010 - ieeexplore.ieee.org
This is the first part of a two-part paper on the information theoretic study of biometric security
systems. In this paper, the design of single-use biometric security systems is analyzed from …

The interplay between entropy and variational distance

SW Ho, RW Yeung - IEEE Transactions on Information Theory, 2010 - ieeexplore.ieee.org
The relation between the Shannon entropy and variational distance, two fundamental and frequently used quantities in information theory, is studied in this paper by means of certain …
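
For reference, the two quantities compared in this paper have the following standard definitions (stated here for convenience; note that some authors include a factor 1/2 in the variational distance):
\[
  H(P) = -\sum_{x} P(x)\log P(x), \qquad
  V(P,Q) = \sum_{x} \bigl\lvert P(x) - Q(x) \bigr\rvert .
\]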

On the interplay between conditional entropy and error probability

SW Ho, S Verdú - IEEE Transactions on Information Theory, 2010 - ieeexplore.ieee.org
Fano's inequality relates the error probability of guessing a finitely-valued random variable X
given another random variable Y to the conditional entropy of X given Y. It is not …
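
A standard form of Fano's inequality, for X taking values in a finite set \(\mathcal{X}\) and estimated from Y with error probability \(P_e\), is recalled here for reference (not quoted from the abstract):
\[
  H(X \mid Y) \;\le\; h_b(P_e) + P_e \log\bigl(\lvert \mathcal{X}\rvert - 1\bigr),
  \qquad h_b(p) = -p\log p - (1-p)\log(1-p).
\]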

Entropy bounds for discrete random variables via maximal coupling

I Sason - IEEE Transactions on Information Theory, 2013 - ieeexplore.ieee.org
This paper derives new bounds on the difference of the entropies of two discrete random
variables in terms of the local and total variation distances between their probability mass …
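
The coupling fact underlying such bounds is standard background rather than a result of this paper: for distributions P and Q on the same alphabet, the minimum mismatch probability over all couplings equals the total variation distance, and a maximal coupling attains it:
\[
  \min_{(X,Y):\, X\sim P,\; Y\sim Q} \Pr[X \neq Y]
  \;=\; d_{\mathrm{TV}}(P,Q)
  \;=\; \tfrac{1}{2}\sum_{x}\bigl\lvert P(x)-Q(x)\bigr\rvert .
\]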

Directed information on abstract spaces: Properties and variational equalities

CD Charalambous, PA Stavrou - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
Directed information and its variants are utilized extensively in the characterization of the
capacity of channels with memory and feedback, nonanticipative lossy data compression …
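
For discrete-time, finite-alphabet sequences, Massey's directed information (of which the abstract-space quantities studied here are generalizations) takes the standard form
\[
  I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I\bigl(X^i;\, Y_i \mid Y^{i-1}\bigr).
\]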

The Rényi capacity and center

B Nakiboğlu - IEEE Transactions on Information Theory, 2018 - ieeexplore.ieee.org
Rényi's information measures (the Rényi information, mean, capacity, radius, and center) are
analyzed by relying on the elementary properties of the Rényi divergence and the power …
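
The underlying divergence has the standard definition, recalled here for reference, for an order \(\alpha \in (0,1)\cup(1,\infty)\):
\[
  D_\alpha(P\,\|\,Q) \;=\; \frac{1}{\alpha-1}\,\log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha}.
\]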

[HTML] On the entropy of couplings

M Kovačević, I Stanojević, V Šenk - Information and Computation, 2015 - Elsevier
In this paper, some general properties of Shannon information measures are investigated
over sets of probability distributions with restricted marginals. Certain optimization problems …
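
The optimization problems in question are, in their usual formulation (stated here as background, not quoted from the paper), extremal joint entropies over couplings: given marginals P and Q, one studies
\[
  \min_{\pi \in \Pi(P,Q)} H(\pi)
  \quad\text{and}\quad
  \max_{\pi \in \Pi(P,Q)} H(\pi),
\]
where \(\Pi(P,Q)\) denotes the set of joint distributions with these marginals; every coupling trivially satisfies \(\max\{H(P),H(Q)\} \le H(\pi) \le H(P)+H(Q)\), the upper bound being attained by the independent coupling.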

On information divergence measures and a unified typicality

SW Ho, RW Yeung - IEEE Transactions on Information Theory, 2010 - ieeexplore.ieee.org
Strong typicality, which is more powerful for theorem proving than weak typicality, can be
applied to finite alphabets only, while weak typicality can be applied to countable alphabets …
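
As background, the two notions being unified are usually stated as follows (standard definitions, not quoted from the abstract): a sequence \(x^n\) is weakly typical if
\[
  \Bigl|\, -\tfrac{1}{n}\log p(x^n) - H(X) \,\Bigr| \le \epsilon,
\]
and strongly typical if its empirical distribution is uniformly close to p, e.g. \(\sum_{a} \bigl| \tfrac{1}{n}N(a \mid x^n) - p(a) \bigr| \le \epsilon\) with \(N(a \mid x^n)=0\) whenever \(p(a)=0\).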

Onicescu's informational energy and correlation coefficient in exponential families

F Nielsen - Foundations, 2022 - mdpi.com
The informational energy of Onicescu is a positive quantity that measures the amount of
uncertainty of a random variable. However, contrary to Shannon's entropy, the informational …
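
For a discrete distribution p, Onicescu's informational energy has the standard definition (added here for reference)
\[
  E(p) \;=\; \sum_{x} p(x)^2,
\]
so that, unlike Shannon entropy, it grows as the distribution becomes more concentrated; it is related to the Rényi entropy of order 2 by \(H_2(p) = -\log E(p)\).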