On the measure of the information in a statistical experiment

J Ginebra - 2007 - projecteuclid.org
Setting aside experimental costs, the choice of an experiment is usually formulated
in terms of the maximization of a measure of information, often presented as an optimality …

On local divergences between two probability measures

G Avlogiaris, A Micheas, K Zografos - Metrika, 2016 - Springer
A broad class of local divergences between two probability measures or between the
respective probability distributions is proposed in this paper. The introduced local …

A new family of divergence measures for tests of fit

K Mattheou, A Karagrigoriou - Australian & New Zealand …, 2010 - Wiley Online Library
The aim of this work is to investigate a new family of divergence measures based on the
recently introduced Basu, Harris, Hjort and Jones (BHHJ) measure of divergence …
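For context on the measure this entry builds on: the BHHJ divergence is better known as the density power divergence of Basu, Harris, Hjort and Jones (1998). As a reminder (recalled from that literature, not quoted from the paper above), for densities $g$ (true) and $f$ (model) and tuning parameter $\alpha > 0$ it is usually written

$$
d_\alpha(g, f) \;=\; \int \left\{ f^{1+\alpha}(x) \;-\; \Bigl(1 + \tfrac{1}{\alpha}\Bigr) g(x)\, f^{\alpha}(x) \;+\; \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx,
$$

which recovers the Kullback–Leibler divergence in the limit $\alpha \to 0$.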

On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures

K Zografos - Probability in the Engineering and Informational …, 2023 - cambridge.org
This paper concentrates on the fundamental concepts of entropy, information and
divergence in the case where the distribution function and the respective survival function …

On two forms of Fisher's measure of information

T Papaioannou, K Ferentinos - Communications in Statistics …, 2005 - Taylor & Francis
Fisher's information number is the second moment of the "score function" where the
derivative is with respect to x rather than θ. It is Fisher's information for a location parameter …
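The verbal definition in this snippet can be made explicit. As recalled from the standard literature (a sketch, not quoted from the paper itself), Fisher's information number of a density $f$ differentiates the log-density in its argument $x$ rather than in the parameter:

$$
J(f) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial x} \log f(X)\right)^{\!2}\right] \;=\; \int \frac{\bigl(f'(x)\bigr)^{2}}{f(x)}\, dx,
$$

which coincides with the usual parametric Fisher information $I(\theta)$ when $\theta$ is a location parameter, since then $\partial_\theta f(x - \theta) = -\partial_x f(x - \theta)$.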

On Properties of the (Φ, a)-Power Divergence Family with Applications in Goodness of Fit Tests

F Vonta, K Mattheou, A Karagrigoriou - Methodology and Computing in …, 2012 - Springer
In this paper we unify the different measures of divergence by introducing a general class of
measures of divergence, the (Φ, a)-power divergence family, and investigate its main …

Tests of fit for a lognormal distribution

A Batsidis, P Economou, G Tzavelas - Journal of Statistical …, 2016 - Taylor & Francis
The problem of goodness of fit of a lognormal distribution is usually reduced to testing
goodness of fit of the logarithmic data to a normal distribution. In this paper, new goodness …

Combinatorial information theory: I. Philosophical basis of cross-entropy and entropy

RK Niven - arXiv preprint cond-mat/0512017, 2005 - arxiv.org
This study critically analyses the information-theoretic, axiomatic and combinatorial
philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is …

On Entropy and Divergence Type Measures of Bivariate Extreme Value Copulas

K Zografos - REVSTAT-Statistical Journal, 2024 - revstat.ine.pt
Pickands dependence function is the basis of extreme value copulas which formulate the
extreme dependence between random variables. Exact forms of cumulative entropy and …

[BOOK] Advances in Data Analysis

CH Skiadas - 2010 - Springer
Mathematics Subject Classification (2000): 03E72, 05A10, 05C80, 11B65, 11K45, 37A50,
37E25, 37N40, 58E17, 60A10, 60B12, 60E05, 60E07, 60F05, 60F17, 60G05, 60G15 …