On the measure of the information in a statistical experiment
J Ginebra - 2007 - projecteuclid.org
Abstract Setting aside experimental costs, the choice of an experiment is usually formulated
in terms of the maximization of a measure of information, often presented as an optimality …
On local divergences between two probability measures
A broad class of local divergences between two probability measures or between the
respective probability distributions is proposed in this paper. The introduced local …
A new family of divergence measures for tests of fit
K Mattheou, A Karagrigoriou - Australian & New Zealand …, 2010 - Wiley Online Library
The aim of this work is to investigate a new family of divergence measures based on the
recently introduced Basu, Harris, Hjort and Jones (BHHJ) measure of divergence …
On reconsidering entropies and divergences and their cumulative counterparts: Csiszar's, DPD's and Fisher's type cumulative and survival measures
K Zografos - Probability in the Engineering and Informational …, 2023 - cambridge.org
This paper extends the fundamental concepts of entropy, information and
divergence to the case where the distribution function and the respective survival function …
On two forms of Fisher's measure of information
T Papaioannou, K Ferentinos - Communications in Statistics …, 2005 - Taylor & Francis
Fisher's information number is the second moment of the “score function” where the
derivative is with respect to x rather than Θ. It is Fisher's information for a location parameter …
On Properties of the (Φ, a)-Power Divergence Family with Applications in Goodness of Fit Tests
In this paper we unify the different measures of divergence by introducing a general class of
measures of divergence, the (Φ, a)-power divergence family, and investigate its main …
Tests of fit for a lognormal distribution
The problem of goodness of fit of a lognormal distribution is usually reduced to testing
goodness of fit of the logarithmic data to a normal distribution. In this paper, new goodness …
Combinatorial information theory: I. Philosophical basis of cross-entropy and entropy
RK Niven - arXiv preprint cond-mat/0512017, 2005 - arxiv.org
This study critically analyses the information-theoretic, axiomatic and combinatorial
philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is …
On Entropy and Divergence Type Measures of Bivariate Extreme Value Copulas: Accepted July 2024
K Zografos - REVSTAT-Statistical Journal, 2024 - revstat.ine.pt
The Pickands dependence function is the basis of extreme value copulas, which formulate the
extreme dependence between random variables. Exact forms of cumulative entropy and …
[BOOK] Advances in Data Analysis
CH Skiadas - 2010 - Springer
Mathematics Subject Classification (2000): 03E72, 05A10, 05C80, 11B65, 11K45, 37A50,
37E25, 37N40, 58E17, 60A10, 60B12, 60E05, 60E07, 60F05, 60F17, 60G05, 60G15 …