[BOOK] Information geometry and its applications
S Amari - 2016 - books.google.com
This is the first comprehensive book on information geometry, written by the founder of the
field. It begins with an elementary introduction to dualistic geometry and proceeds to a wide …
Privacy–security trade-offs in biometric security systems—Part I: Single use case
This is the first part of a two-part paper on the information theoretic study of biometric security
systems. In this paper, the design of single-use biometric security systems is analyzed from …
The interplay between entropy and variational distance
The relation between the Shannon entropy and variational distance, two fundamental and
frequently-used quantities in information theory, is studied in this paper by means of certain …
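For context (a standard definition, not a quote from the paper), the variational distance between two probability mass functions on a common finite alphabet is half their L1 distance. A minimal sketch:

```python
def variational_distance(p, q):
    """TV(P, Q) = (1/2) * sum_i |p_i - q_i| for pmfs on the same alphabet."""
    if len(p) != len(q):
        raise ValueError("pmfs must be defined on the same alphabet")
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Disjoint supports give the maximum distance of 1; identical pmfs give 0.
print(variational_distance([0.5, 0.5], [1.0, 0.0]))  # 0.5
```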
On the interplay between conditional entropy and error probability
Fano's inequality relates the error probability of guessing a finitely-valued random variable X
given another random variable Y and the conditional entropy of X given Y. It is not …
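For reference, the standard statement of Fano's inequality for a finitely-valued X (not a quote from the paper, which studies refinements of this relation):

```latex
% Fano's inequality: the conditional entropy is bounded via the error
% probability P_e of any estimator \hat{X}(Y) of X from Y.
H(X \mid Y) \le h_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr),
\qquad P_e = \Pr\bigl[\hat{X}(Y) \ne X\bigr],
```

where \(h_b(p) = -p \log p - (1-p) \log(1-p)\) is the binary entropy function and \(\mathcal{X}\) is the alphabet of X.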
Entropy bounds for discrete random variables via maximal coupling
I Sason - IEEE Transactions on Information Theory, 2013 - ieeexplore.ieee.org
This paper derives new bounds on the difference of the entropies of two discrete random
variables in terms of the local and total variation distances between their probability mass …
Directed information on abstract spaces: Properties and variational equalities
Directed information or its variants are utilized extensively in the characterization of the
capacity of channels with memory and feedback, nonanticipative lossy data compression …
The Rényi capacity and center
B Nakiboğlu - IEEE Transactions on Information Theory, 2018 - ieeexplore.ieee.org
Rényi's information measures (the Rényi information, mean, capacity, radius, and center) are
analyzed relying on the elementary properties of the Rényi divergence and the power …
[HTML] On the entropy of couplings
In this paper, some general properties of Shannon information measures are investigated
over sets of probability distributions with restricted marginals. Certain optimization problems …
On information divergence measures and a unified typicality
Strong typicality, which is more powerful for theorem proving than weak typicality, can be
applied to finite alphabets only, while weak typicality can be applied to countable alphabets …
Onicescu's informational energy and correlation coefficient in exponential families
F Nielsen - Foundations, 2022 - mdpi.com
The informational energy of Onicescu is a positive quantity that measures the amount of
uncertainty of a random variable. However, contrary to Shannon's entropy, the informational …
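As a concrete illustration (not taken from the cited paper), Onicescu's informational energy of a discrete distribution p is the sum of squared probabilities, E(p) = Σᵢ pᵢ². Contrary to Shannon entropy, it is largest (equal to 1) for a point mass and smallest (equal to 1/n) for the uniform distribution on n outcomes:

```python
# Onicescu's informational energy: E(p) = sum_i p_i^2.
# It behaves opposite to Shannon entropy: concentration raises it,
# spreading the mass lowers it.

def informational_energy(p):
    if abs(sum(p) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(pi * pi for pi in p)

print(informational_energy([1.0, 0.0, 0.0]))          # 1.0  (point mass)
print(informational_energy([0.25, 0.25, 0.25, 0.25]))  # 0.25 (uniform, 1/n)
```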