Some universal insights on divergences for statistics, machine learning and artificial intelligence

M Broniatowski, W Stummer - Geometric Structures of Information, 2019 - Springer
Dissimilarity quantifiers such as divergences (e.g., Kullback–Leibler information, relative
entropy) and distances between probability distributions are widely used in statistics …

Theoretical aspects on measures of directed information with simulations

T Gkelsinis, A Karagrigoriou - Mathematics, 2020 - mdpi.com
Measures of directed information are obtained through classical measures of information by
taking into account specific qualitative characteristics of each event. These measures are …

A unifying framework for some directed distances in statistics

M Broniatowski, W Stummer - Handbook of Statistics, 2022 - Elsevier
Density-based directed distances—particularly known as divergences—between probability
distributions are widely used in statistics as well as in the adjacent research fields of …

A precise bare simulation approach to the minimization of some distances. I. Foundations

M Broniatowski, W Stummer - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
In information theory—as well as in the adjacent fields of statistics, machine learning,
artificial intelligence, signal processing and pattern recognition—many flexibilizations of the …

A characterization of all single-integral, non-kernel divergence estimators

S Jana, A Basu - IEEE Transactions on Information Theory, 2019 - ieeexplore.ieee.org
Divergence measures have been used for a long time for different purposes in information
theory and statistics. In particular, density-based minimum divergence estimation is a …

On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures

K Zografos - Probability in the Engineering and Informational …, 2023 - cambridge.org
This paper concentrates on the fundamental concepts of entropy, information and
divergence to the case where the distribution function and the respective survival function …

Testing local hypotheses with different types of incomplete data

M Boukeloua - TWMS Journal of Applied and Engineering …, 2024 - belgelik.isikun.edu.tr
In this work, we consider a general framework of incomplete data which includes many types
of censoring and truncation models. Under this framework and assuming that the distribution …

Model selection in a composite likelihood framework based on density power divergence

E Castilla, N Martín, L Pardo, K Zografos - Entropy, 2020 - mdpi.com
This paper presents a model selection criterion in a composite likelihood framework based
on density power divergence measures and in the composite minimum density power …

A criterion for local model selection

G Avlogiaris, AC Micheas, K Zografos - Sankhya A, 2019 - Springer
In this paper, we introduce a class of local divergences between two probability distributions
and illustrate its usefulness in model selection. Explicit expressions of the proposed local …

On testing local hypotheses via local divergence

G Avlogiaris, A Micheas, K Zografos - Statistical Methodology, 2016 - Elsevier
The aim of this paper is to propose procedures that test statistical hypotheses locally, that is,
assess the validity of a model in a specific domain of the data. In this context, the one and …