[Book][B] An introduction to Kolmogorov complexity and its applications
Ming Li, Paul Vitányi - Fourth Edition, Texts in Computer Science
An operational characterization of mutual information in algorithmic information theory
We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of
strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret …
Spectral approach to the communication complexity of multi-party key agreement
G Caillat-Grenier, A Romashchenko - arXiv preprint arXiv:2305.01355, 2023 - arxiv.org
We propose a linear algebraic method, rooted in the spectral properties of graphs, that can
be used to prove lower bounds in communication complexity. Our proof technique effectively …
Universal almost optimal compression and Slepian-Wolf coding in probabilistic polynomial time
In a lossless compression system with target lengths, a compressor $\mathcal{C}$ maps an
integer $m$ and a binary string $x$ to an $m$-bit code $p$, and if $m$ is sufficiently …
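The target-length interface described in this entry can be illustrated with a toy sketch (my own illustration, not the paper's construction; `compress`/`decompress` and the candidate-set trick are assumptions for demonstration): hash $x$ down to an $m$-bit code, and let the decompressor invert the hash by brute force over a small known candidate set.

```python
import hashlib

def compress(m: int, x: bytes) -> int:
    """Toy compressor: map x to an m-bit code by hashing."""
    h = int.from_bytes(hashlib.sha256(x).digest(), "big")
    return h % (2 ** m)

def decompress(m: int, p: int, candidates):
    """Toy decompressor: find the unique candidate whose code is p."""
    matches = [x for x in candidates if compress(m, x) == p]
    return matches[0] if len(matches) == 1 else None

# With m comfortably above log2(#candidates), decoding succeeds
# (up to hash collisions, which are unlikely for small sets).
words = [b"cat", b"dog", b"bird"]
code = compress(32, b"dog")
assert decompress(32, code, words) == b"dog"
```

The hard part addressed by the papers is achieving this without the decompressor holding an explicit candidate list, and with $m$ close to the Kolmogorov complexity of $x$; the sketch only conveys the shape of the compressor/decompressor pair.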
27 Open Problems in Kolmogorov Complexity
This formula can be informally read as follows: the $i$th message $m_i$ brings us $\log(1/p_i)$ "bits
of information" (whatever this means) and appears with frequency $p_i$, so $H$ is the expected …
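The reading of $H$ as an expected value can be made concrete with a short sketch (my own illustration; `shannon_entropy` is a hypothetical helper name, not from the listed paper):

```python
import math

def shannon_entropy(probs):
    """H = sum_i p_i * log2(1/p_i): expected bits of information per message."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information.
assert shannon_entropy([0.5, 0.5]) == 1.0
# Four equally likely messages carry two bits each on average.
assert shannon_entropy([0.25] * 4) == 2.0
```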
Coding in the fork network in the framework of Kolmogorov complexity
A Romashchenko - arXiv preprint arXiv:1602.02648, 2016 - arxiv.org
Many statements from classical information theory (the theory of Shannon entropy) have
natural counterparts in algorithmic information theory (in the framework of Kolmogorov …
Optimal probabilistic polynomial time compression and the Slepian-Wolf theorem: tighter version and simple proofs
B Bauwens - arXiv preprint arXiv:1802.00750, 2018 - arxiv.org
We simplify the proofs of the two results in Marius Zimand's paper "Kolmogorov
complexity version of Slepian-Wolf coding", Proceedings of STOC 2017, pp. 22--32. The first is …
Universal codes in the shared-randomness model for channels with general distortion capabilities
We put forth new models for universal channel coding. Unlike standard codes which are
designed for a specific type of channel, our most general universal code makes …