Estimation of entropy and mutual information
L Paninski - Neural computation, 2003 - ieeexplore.ieee.org
We present some new results on the nonparametric estimation of entropy and mutual
information. First, we use an exact local expansion of the entropy function to prove almost …
Source coding, large deviations, and approximate pattern matching
We present a development of parts of rate-distortion theory and pattern-matching algorithms
for lossy data compression, centered around a lossy version of the asymptotic equipartition …
A hierarchy of information quantities for finite block length analysis of quantum tasks
We consider two fundamental tasks in quantum information theory, data compression with
quantum side information, as well as randomness extraction against quantum side …
Asymptotic estimates in information theory with non-vanishing error probabilities
VYF Tan - Foundations and Trends® in Communications and …, 2014 - nowpublishers.com
This monograph presents a unified treatment of single- and multi-user problems in
Shannon's information theory where we depart from the requirement that the error …
On the dispersions of three network information theory problems
We analyze the dispersions of distributed lossless source coding (the Slepian-Wolf
problem), the multiple-access channel, and the asymmetric broadcast channel. For the two …
Second-order asymptotics in fixed-length source coding and intrinsic randomness
M Hayashi - IEEE Transactions on Information Theory, 2008 - ieeexplore.ieee.org
There is a difference between the optimal rates of fixed-length source coding and intrinsic
randomness when we care about the second-order asymptotics. We prove this difference for …
Optimal lossless data compression: Non-asymptotics and asymptotics
This paper provides an extensive study of the behavior of the best achievable rate (and
other related fundamental limits) in variable-length strictly lossless compression. In the non …
Estimating the entropy of binary time series: Methodology, some theory and a simulation study
Y Gao, I Kontoyiannis, E Bienenstock - Entropy, 2008 - mdpi.com
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and
extensive comparison between some of the most popular and effective entropy estimation …
Nonasymptotic and second-order achievability bounds for coding with side-information
S Watanabe, S Kuzuoka… - IEEE Transactions on …, 2015 - ieeexplore.ieee.org
We present novel nonasymptotic, or finite-blocklength, achievability bounds for three side-
information problems in network information theory. These include: 1) the Wyner-Ahlswede …
Pointwise redundancy in lossy data compression and universal lossy data compression
I Kontoyiannis - IEEE Transactions on Information Theory, 2000 - ieeexplore.ieee.org
We characterize the achievable pointwise redundancy rates for lossy data compression at a
fixed distortion level. "Pointwise redundancy" refers to the difference between the description …