Cryptography Seminar


Alexander Shen


Algorithmic information theory

We often say 'this message does not contain information about something' or 'Alice does not know Bob's card', but it is not always easy to give these informal claims a precise mathematical meaning, and this can be done in different ways. We discuss this problem and then concentrate on one specific tool that can be used to formalize such notions (at least asymptotically): Kolmogorov complexity and algorithmic information theory. The complexity of a binary string $x$ (or of any other finite object) is defined as the minimal length of a program that produces $x$; the mutual information between two objects $x$ and $y$ is then defined as the difference between the sum of the complexities of $x$ and $y$ and the complexity of the pair $(x,y)$. We discuss the basic results of algorithmic information theory and its connections to Shannon information theory, as well as some open questions.
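
A compact way to write these definitions (a sketch using common notation not fixed in the abstract: $C$ for plain complexity, $U$ for a fixed universal machine, $|p|$ for the length of a program $p$):

$$C(x) = \min\{\, |p| : U(p) = x \,\}, \qquad I(x:y) = C(x) + C(y) - C(x,y).$$

Here $C(x,y)$ is the complexity of some standard encoding of the pair; by a theorem of Kolmogorov and Levin, $I(x:y)$ is symmetric up to a logarithmic error term.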