Some universal insights on divergences for statistics, machine learning and artificial intelligence
M Broniatowski, W Stummer - Geometric Structures of Information, 2019 - Springer
Dissimilarity quantifiers such as divergences (e.g., Kullback–Leibler information, relative
entropy) and distances between probability distributions are widely used in statistics …
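As a concrete instance of the divergences this entry surveys, here is a minimal sketch (illustrative, not taken from the paper) of the Kullback–Leibler divergence between two discrete probability distributions:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions
    given as equal-length sequences of probabilities; terms with p_i = 0
    contribute zero by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p || p) = 0; D(p || q) > 0 whenever p != q
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

Note that D(p || q) is a directed (asymmetric) distance: in general D(p || q) != D(q || p), which is exactly the "directed" aspect emphasized in several of the entries below.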
Theoretical aspects on measures of directed information with simulations
Measures of directed information are obtained through classical measures of information by
taking into account specific qualitative characteristics of each event. These measures are …
A unifying framework for some directed distances in statistics
M Broniatowski, W Stummer - Handbook of Statistics, 2022 - Elsevier
Density-based directed distances—particularly known as divergences—between probability
distributions are widely used in statistics as well as in the adjacent research fields of …
A precise bare simulation approach to the minimization of some distances. I. Foundations
M Broniatowski, W Stummer - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
In information theory—as well as in the adjacent fields of statistics, machine learning,
artificial intelligence, signal processing and pattern recognition—many flexibilizations of the …
A characterization of all single-integral, non-kernel divergence estimators
Divergence measures have been used for a long time for different purposes in information
theory and statistics. In particular, density-based minimum divergence estimation is a …
On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
K Zografos - Probability in the Engineering and Informational …, 2023 - cambridge.org
This paper extends the fundamental concepts of entropy, information and
divergence to the case where the distribution function and the respective survival function …
Testing local hypotheses with different types of incomplete data
M Boukeloua - TWMS Journal of Applied and Engineering …, 2024 - belgelik.isikun.edu.tr
In this work, we consider a general framework of incomplete data which includes many types
of censoring and truncation models. Under this framework and assuming that the distribution …
Model selection in a composite likelihood framework based on density power divergence
This paper presents a model selection criterion in a composite likelihood framework based
on density power divergence measures and in the composite minimum density power …
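The density power divergence (DPD) underlying this criterion has a closed form due to Basu et al. (1998); a minimal sketch for discrete distributions, with the tuning parameter `alpha` as in that formulation (illustrative, not this paper's estimator):

```python
def dpd(g, f, alpha):
    """Density power divergence d_alpha(g, f) between discrete
    distributions g (data) and f (model), for alpha > 0:
        sum_i [ f_i^(1+a) - (1 + 1/a) g_i f_i^a + (1/a) g_i^(1+a) ].
    It is nonnegative, zero iff g == f, and tends to KL(g || f)
    as alpha -> 0."""
    return sum(fi ** (1 + alpha)
               - (1 + 1 / alpha) * gi * fi ** alpha
               + (1 / alpha) * gi ** (1 + alpha)
               for gi, fi in zip(g, f))

# d_alpha(g, g) = 0; positive for distinct distributions
print(dpd([0.9, 0.1], [0.5, 0.5], alpha=0.5))
```

Larger `alpha` downweights observations that are improbable under the model, trading some efficiency for robustness, which is why DPD-based criteria are attractive for model selection under contamination.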
A criterion for local model selection
In this paper, we introduce a class of local divergences between two probability distributions
and illustrate its usefulness in model selection. Explicit expressions of the proposed local …
On testing local hypotheses via local divergence
The aim of this paper is to propose procedures that test statistical hypotheses locally, that is,
assess the validity of a model in a specific domain of the data. In this context, the one and …