[BOOK] Information theory and network coding
RW Yeung - 2008 - books.google.com
This book is an evolution from my book A First Course in Information Theory published in
2002, when network coding was still in its infancy. The last few years have witnessed the …
Optimal bounds between f-divergences and integral probability metrics
R Agrawal, T Horel - Journal of Machine Learning Research, 2021 - jmlr.org
The families of f-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability
Metrics (e.g. total variation distance or maximum mean discrepancies) are widely used to …
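As background for the two families compared in this entry (standard textbook definitions and a classical bound, not the paper's sharp results): for a convex function f with f(1) = 0 and a function class \mathcal{F},
\[ D_f(P \,\|\, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ, \qquad \gamma_{\mathcal{F}}(P, Q) = \sup_{g \in \mathcal{F}} \left| \int g \, dP - \int g \, dQ \right|, \]
and Pinsker's inequality is one well-known bound linking the two families:
\[ d_{TV}(P, Q) \le \sqrt{\tfrac{1}{2} D_{KL}(P \,\|\, Q)}. \]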
Entropy bounds for discrete random variables via maximal coupling
I Sason - IEEE Transactions on Information Theory, 2013 - ieeexplore.ieee.org
This paper derives new bounds on the difference of the entropies of two discrete random
variables in terms of the local and total variation distances between their probability mass …
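For orientation, one standard way bounds of this kind arise (a textbook argument, not necessarily the paper's sharpest result): a maximal coupling of pmfs P and Q on a common finite alphabet \mathcal{X} is a pair (X, Y) with these marginals satisfying
\[ \Pr[X \ne Y] = d_{TV}(P, Q) = \tfrac{1}{2} \sum_{x \in \mathcal{X}} |P(x) - Q(x)|, \]
and combining it with Fano's inequality gives, with d = d_{TV}(P, Q) and h_b the binary entropy function,
\[ |H(P) - H(Q)| \le \max\{H(X \mid Y), H(Y \mid X)\} \le h_b(d) + d \log(|\mathcal{X}| - 1). \]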
Dependence of integrated, instantaneous, and fluctuating entropy production on the initial state in quantum and classical processes
We consider the additional entropy production (EP) incurred by a fixed quantum or classical
process on some initial state ρ, above the minimum EP incurred by the same process on any …
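For context, the kind of initial-state dependence studied here is often summarized by a mismatch-cost decomposition (stated from memory as background, not as this paper's exact result): if \varphi is the initial state minimizing the EP of the fixed process and primes denote the corresponding final states, then for any initial state \rho
\[ \Sigma(\rho) - \Sigma(\varphi) = S(\rho \,\|\, \varphi) - S(\rho' \,\|\, \varphi'), \]
with S(\cdot \,\|\, \cdot) the classical or quantum relative entropy.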
[PDF] Using thermodynamic integration to calculate the posterior probability in Bayesian model selection problems
PM Goggans, Y Chi - AIP Conference Proceedings, 2004 - researchgate.net
This paper gives an algorithm for calculating posterior probabilities using thermodynamic
integration. The thermodynamic integration calculations are accomplished by annealing an …
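As background on the method named in the title (the standard path-sampling identity, not specific to this paper's algorithm): with the power posterior p_\beta(\theta) \propto p(\theta)\, p(D \mid \theta)^{\beta} for \beta \in [0, 1],
\[ \log p(D) = \int_0^1 \mathbb{E}_{\theta \sim p_\beta} \big[ \log p(D \mid \theta) \big] \, d\beta, \]
since \tfrac{d}{d\beta} \log Z(\beta) = \mathbb{E}_{p_\beta}[\log p(D \mid \theta)] with Z(0) = 1 and Z(1) = p(D). In practice the expectations are estimated by annealed sampling at a ladder of \beta values and the integral by a quadrature rule.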
Entropy characteristics of subsets of states. I
ME Shirokov - Izvestiya: Mathematics, 2006 - iopscience.iop.org
We study the properties of quantum entropy and χ-capacity (regarded as a function of sets of
quantum states) in the infinite-dimensional case. We obtain conditions for the boundedness …
[HTML] On the entropy of couplings
In this paper, some general properties of Shannon information measures are investigated
over sets of probability distributions with restricted marginals. Certain optimization problems …
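As a concrete instance of the constrained optimization alluded to here (elementary bounds, not the paper's results): over all couplings P_{XY} with fixed marginals P and Q on finite alphabets,
\[ \max\{H(P), H(Q)\} \le H(X, Y) \le H(P) + H(Q), \]
with the upper bound attained by the independent coupling; the lower bound is generally not attainable, and characterizing the actual minimum joint entropy over couplings is the kind of problem treated in this line of work.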
An efficient entropy-based stopping rule for mitigating risk factors in supply nets
Potential supply-net risk factors include capacity issues, currency volatility, design changes,
frequent changes in tax regulations, unsafe information systems, and port shutdowns. Such …
On the discontinuity of the Shannon information measures
The Shannon information measures are well known to be continuous functions of the
probability distribution for a given finite alphabet. In this paper, however, we show that these …
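A standard illustration of the phenomenon, for a countably infinite alphabet (the usual textbook-style example, not necessarily the paper's own): let P_n place mass 1 - \varepsilon_n on the symbol 0 and \varepsilon_n / n on each of n further symbols, with \varepsilon_n = 1/\sqrt{\log n}. Then d_{TV}(P_n, \delta_0) = \varepsilon_n \to 0, yet
\[ H(P_n) \ge \varepsilon_n \log n = \sqrt{\log n} \to \infty, \]
while H(\delta_0) = 0, so entropy fails to be continuous once the alphabet is allowed to be countably infinite.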
On convergence properties of Shannon entropy
FJ Piera, P Parada - Problems of Information Transmission, 2009 - Springer
Convergence properties of Shannon entropy are studied. In the differential setting, it is
known that weak convergence of probability measures (convergence in distribution) is not …
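A standard counterexample for the differential setting (again generic, not necessarily the paper's): the densities f_n(x) = 1 + \cos(2\pi n x) on [0, 1] converge weakly to the uniform density, whose differential entropy is 0, by the Riemann-Lebesgue lemma; yet by periodicity
\[ h(f_n) = -\int_0^1 (1 + \cos 2\pi n x) \log(1 + \cos 2\pi n x)\, dx = -\int_0^1 (1 + \cos 2\pi u) \log(1 + \cos 2\pi u)\, du < 0 \]
is a negative constant independent of n, so h(f_n) does not converge to the differential entropy of the weak limit.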