Information-theoretic foundations of mismatched decoding
Shannon's channel coding theorem characterizes the maximal rate of information that can
be reliably transmitted over a communication channel when optimal encoding and decoding …
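For reference, and as standard background not quoted from the abstract: for a discrete memoryless channel P_{Y|X}, the channel coding theorem says reliable transmission is possible at every rate below, and at no rate above, the capacity
\[
C = \max_{P_X} I(X;Y).
\]
Mismatched decoding asks how much of this rate survives when the decoder must use a fixed, possibly suboptimal decoding metric rather than the optimal maximum-likelihood rule.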
Finite blocklength lossy source coding for discrete memoryless sources
Shannon propounded a theoretical framework, known collectively as information theory, that
uses mathematical tools to understand, model, and analyze modern mobile wireless …
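As standard background for this line of work (not taken from the abstract): for a discrete memoryless source X with distortion measure d, the asymptotic benchmark is the rate-distortion function
\[
R(D) = \min_{P_{\hat{X}\mid X}\,:\ \mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X}),
\]
and finite blocklength analysis quantifies how quickly the best achievable rate at blocklength n approaches R(D).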
Indirect lossy source coding with observed source reconstruction: Nonasymptotic bounds and second-order asymptotics
This paper considers the joint compression of a pair of correlated sources, where the
encoder is allowed to access only one of the sources. The objective is to recover both …
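For context, second-order (Gaussian approximation) results of this kind typically take the form
\[
R(n, D, \varepsilon) = R(D) + \sqrt{\frac{V(D)}{n}}\, Q^{-1}(\varepsilon) + O\!\left(\frac{\log n}{n}\right),
\]
where \varepsilon is the tolerated excess-distortion probability, V(D) is the dispersion, and Q^{-1} is the inverse Gaussian complementary CDF; the indirect problem considered here has its own rate-distortion and dispersion terms, which are what the paper characterizes.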
Achievable refined asymptotics for successive refinement using Gaussian codebooks
We study the mismatched successive refinement problem where one uses Gaussian
codebooks to compress an arbitrary memoryless source with successive minimum …
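As a matched baseline (not the mismatched Gaussian-codebook result of the paper): a Gaussian source with variance \sigma^2 under mean squared error is successively refinable, so descriptions at rates R_1 and R_1 + R_2 can meet distortions D_2 \le D_1 \le \sigma^2 exactly when
\[
R_1 \ge \tfrac{1}{2}\log\frac{\sigma^2}{D_1}, \qquad R_1 + R_2 \ge \tfrac{1}{2}\log\frac{\sigma^2}{D_2}.
\]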
Gaussian approximation of quantization error for estimation from compressed data
We consider the distributional connection between the lossy compressed representation of a
high-dimensional signal X using a random spherical code and the observation of X under an …
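A minimal Monte Carlo sketch of the quantization step being analyzed, written as my own illustration: the dimension, rate, unit-variance i.i.d. Gaussian source, and minimum-distance encoding rule below are assumptions made for the toy example, not parameters taken from the paper.

import numpy as np

# Toy parameters (assumed for illustration only).
rng = np.random.default_rng(0)
n, R, trials = 12, 1.0, 2000          # dimension, rate in bits/sample, Monte Carlo trials
M = int(2 ** (n * R))                  # codebook size 2^{nR}

# Random spherical codebook: codewords uniform on the sphere of radius sqrt(n).
codebook = rng.standard_normal((M, n))
codebook *= np.sqrt(n) / np.linalg.norm(codebook, axis=1, keepdims=True)

# Quantize i.i.d. standard Gaussian vectors by minimum Euclidean distance and
# record the per-component quantization error of the compressed representation.
errors = []
for _ in range(trials):
    x = rng.standard_normal(n)
    idx = np.argmin(np.linalg.norm(codebook - x, axis=1))
    errors.append(codebook[idx] - x)
errors = np.asarray(errors).ravel()

print("per-component error: mean %.3f, variance %.3f" % (errors.mean(), errors.var()))

The paper's distributional claims concern the high-dimensional regime; at these toy sizes the simulation only gives a rough impression of the error statistics.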
Single letter formulas for quantized compressed sensing with Gaussian codebooks
Theoretical and experimental results have shown that compressed sensing with
quantization can perform well if the signal is very sparse, the noise is very low, and the …
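For orientation, and with notation of my own rather than the paper's, a generic quantized compressed sensing measurement model reads
\[
\mathbf{y} = \mathsf{Q}\!\left(\mathbf{A}\mathbf{x} + \mathbf{z}\right), \qquad \mathbf{A}\in\mathbb{R}^{m\times n},\ m\ll n,\ \mathbf{x}\ \text{sparse},\ \mathbf{z}\ \text{noise},
\]
where \mathsf{Q} denotes the quantization step; per its title, the paper analyzes the case in which quantization is performed with Gaussian codebooks.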
Second-order Asymptotics for Asymmetric Broadcast Channel with non-Gaussian Noise
We study the two-user asymmetric broadcast channel with additive non-Gaussian noise and
derive a second-order achievability rate region when separate error probability constraints …
Exponential strong converse for content identification with lossy recovery
We revisit the high-dimensional content identification with lossy recovery problem (Tuncel
and Gündüz, 2014) and establish an exponential strong converse theorem. As a corollary of …
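For readers unfamiliar with the terminology: a strong converse asserts that any code sequence operating outside the achievable region has error probability tending to one, and an exponential strong converse additionally gives the speed, i.e. there is an exponent E > 0, depending on the operating point, such that
\[
P_{\mathrm{e}}^{(n)} \ge 1 - e^{-nE} \quad \text{for all sufficiently large } n.
\]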
The dispersion of mismatched joint source-channel coding for arbitrary sources and additive channels
We consider a joint source-channel coding (JSCC) problem in which we wish to transmit
an arbitrary memoryless source over an arbitrary additive channel. We propose a …
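For comparison, the matched benchmark in this line of work is the joint source-channel coding dispersion result of Kostina and Verdú: transmitting k source symbols over n channel uses with excess-distortion probability \varepsilon requires, to second order,
\[
nC - kR(D) = \sqrt{nV_{\mathrm{c}} + kV_{\mathrm{s}}(D)}\; Q^{-1}(\varepsilon) + O(\log n),
\]
with C, V_c the channel capacity and dispersion and R(D), V_s(D) the source rate-distortion function and dispersion; the paper derives an analogous expansion for its mismatched scheme.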
Joint Data and Semantics Lossy Compression: Nonasymptotic and Second-Order Achievability Bounds
This paper studies a joint data and semantics lossy compression problem in the finite
blocklength regime, where the data and semantic sources are correlated, and only the data …
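One plausible reading of the setup, offered only as an illustration because the snippet is truncated: the encoder observes the data source X^n, correlated with a semantic source S^n, and the reconstructions must satisfy separate distortion constraints such as
\[
\mathbb{E}\big[d_x(X^n,\hat{X}^n)\big] \le D_x, \qquad \mathbb{E}\big[d_s(S^n,\hat{S}^n)\big] \le D_s.
\]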