Dagstuhl-Seminar 03181
Centennial Seminar on Kolmogorov Complexity and Applications
(27 April – 2 May 2003)
Organizers
- Bruno Durand (University of Marseille, FR)
- Leonid A. Levin (Boston University, US)
- Wolfgang Merkle (Universität Heidelberg, DE)
- Alexander Shen (IITP - Moscow, RU)
- Paul M. B. Vitanyi (CWI - Amsterdam, NL)
Algorithmic information theory (Kolmogorov complexity theory) measures the amount of information in a given finite object (a bit string, file, message, etc.) and formalizes the distinction between highly compressible objects that contain little information (regular objects) and incompressible objects with high information content (random objects). This idea was put forward in the 1960s by several researchers, including the famous mathematician Andrei Nikolaevich Kolmogorov, and led to fruitful developments. The seminar, celebrating the 100th anniversary of Kolmogorov's birth, aimed to gather the most active people in the field, including some of Kolmogorov's disciples, for discussion.
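The regular/random distinction can be illustrated with an off-the-shelf compressor: the compressed size of a string is a crude upper bound on its Kolmogorov complexity. A minimal sketch (using zlib as a stand-in for an optimal decompressor, which is an illustrative assumption, not part of the theory itself):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # Upper bound on Kolmogorov complexity:
    # K(data) <= len(zlib.compress(data)) + O(1).
    return len(zlib.compress(data, 9))

regular = b"ab" * 5000        # highly regular: little information
random_ = os.urandom(10000)   # incompressible with high probability

print(compressed_size(regular))   # tiny compared to 10000
print(compressed_size(random_))   # close to 10000
```

Any real compressor gives only an upper bound; the true Kolmogorov complexity is uncomputable, as the abstract notes below.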
Several active fields of research were covered in the talks:
Relations between computational complexity and descriptional complexity. The idea of taking into account the computation time needed for decompression was already clear in the 1960s. However, only recently has this connection become better understood, and interesting relations between complexity classes and time-bounded random (incompressible) objects have been found. This development can also be seen as establishing connections between different notions of randomness (randomness in algorithmic information theory, pseudo-random number generators, etc.).
Starting with the classical work of Martin-Löf, the notion of algorithmic randomness has been closely related to measure theory. Recently it was observed that the classical notion of Hausdorff dimension (and similar notions) can be naturally translated into algorithmic information theory using the martingale technique and related tools.
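A martingale in this setting is a betting strategy whose capital stays bounded on random sequences but can grow without bound on non-random ones. A toy sketch (the fixed-bias strategy below is an illustrative assumption, not from the seminar text):

```python
def martingale_capital(bits, bet_on_one=0.9):
    # Start with capital 1; at each step wager fraction bet_on_one of the
    # current capital on the next bit being 1. Fair odds double the winning
    # stake, so d(w0) + d(w1) = 2*d(w): the martingale fairness condition.
    capital = 1.0
    for b in bits:
        stake = bet_on_one if b == 1 else 1.0 - bet_on_one
        capital *= 2.0 * stake
    return capital

print(martingale_capital([1] * 20))      # grows like 1.8**20: all-ones is far from random
print(martingale_capital([0, 1] * 10))   # the biased strategy loses on a balanced sequence
```

Roughly, a sequence is Martin-Löf random iff no effective martingale's capital grows unboundedly on it; the rate of achievable growth is what connects to Hausdorff dimension.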
The first Kolmogorov paper on the subject was called "Three approaches to the definition of the notion of 'amount of information'", and these approaches were named 'combinatorial', 'probabilistic', and 'algorithmic'. Recently, formal links between these three approaches were found that allow some results of algorithmic information theory to be translated into combinatorial results and statements about Shannon entropy.
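An elementary instance of such a link: the combinatorial information in specifying an n-bit string with exactly k ones (the log of the number of such strings) matches the probabilistic (Shannon) estimate n·H(k/n) up to an O(log n) additive term. A quick numerical check (the parameters n and k are arbitrary):

```python
from math import comb, log2

n, k = 1000, 300  # arbitrary illustrative parameters

# Combinatorial: bits needed to index one n-bit string with exactly k ones.
combinatorial = log2(comb(n, k))

# Probabilistic: n times the Shannon entropy of the bit frequency k/n.
p = k / n
shannon = n * (-p * log2(p) - (1 - p) * log2(1 - p))

print(combinatorial, shannon)  # agree up to an O(log n) additive term
```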
Last but not least, there has been recent work clarifying the distinction between "accidental" information (random noise) and "meaningful" information, and how to separate the two. This is a central topic of statistics and model selection.
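The separation can be caricatured with a compressor: for a string consisting of a regular part plus noise, the compressed size splits into a short description of the regular ("meaningful") part plus the essentially incompressible ("accidental") noise. A rough sketch (treating zlib as a stand-in for an ideal two-part code is an assumption):

```python
import os
import zlib

motif = b"0123456789"
noise = os.urandom(2000)        # the "accidental" part
data = motif * 500 + noise      # regular part followed by noise

whole = len(zlib.compress(data, 9))
regular_only = len(zlib.compress(motif * 500, 9))

# The whole compresses to roughly a short description of the regular,
# "meaningful" part plus the incompressible noise.
print(whole, regular_only + len(noise))
```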
Algorithmic information theory belongs to theoretical computer science and does not claim to be immediately applicable in practice (for example, there is no algorithm that computes the Kolmogorov complexity of a given string). However, its ideas serve as inspiration for quite practical applications in learning theory, pattern recognition, etc., showing that deep theoretical research turns out to be useful unexpectedly often.
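One such application, developed by seminar participants Cilibrasi and Vitanyi, is the normalized compression distance, which measures similarity between arbitrary objects by substituting a real compressor for the uncomputable Kolmogorov complexity. A minimal sketch:

```python
import zlib

def C(s: bytes) -> int:
    # Compressed size as a computable proxy for Kolmogorov complexity.
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: the normalized information
    # distance with K replaced by a real compressor.
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 50
b = b"the quick brown fox jumps over the lazy cat " * 50
c = bytes(range(256)) * 10

print(ncd(a, b))  # small: the two texts share almost all structure
print(ncd(a, c))  # larger: unrelated data
```

The same distance works on texts, genomes, or MIDI files without domain-specific features, which is the source of its practical appeal.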
A comprehensive as well as introductory treatment of Kolmogorov complexity is the monograph by Ming Li and Paul Vitanyi, "An Introduction to Kolmogorov Complexity and Its Applications", Springer-Verlag, New York, 2nd edition, 1997.
- Eric Allender (Rutgers University - Piscataway, US) [dblp]
- Klaus Ambos-Spies (Universität Heidelberg, DE) [dblp]
- Luis Filipe Coelho Antunes (University of Porto, PT)
- Eugene Asarin (University Paris-Diderot, FR) [dblp]
- Veronica Becher (University of Buenos Aires, AR) [dblp]
- Joachim M. Buhmann (ETH Zürich, CH) [dblp]
- Harry Buhrman (CWI - Amsterdam, NL) [dblp]
- Alexey Chernov (IDSIA - Manno, CH)
- Rudi Cilibrasi (CWI - Amsterdam, NL)
- Rodney Downey (Victoria University of Wellington, NZ) [dblp]
- Bruno Durand (University of Marseille, FR)
- Santiago Figueira (University of Buenos Aires, AR)
- Lance Fortnow (University of Chicago, US) [dblp]
- Tobias Gärtner (Universität des Saarlandes, DE)
- Ursula Gather (TU Dortmund, DE)
- Konstantin Gorbunov (IITP - Moscow, RU)
- Yuri Gurevich (Microsoft Research - Redmond, US) [dblp]
- Denis R. Hirschfeldt (University of Chicago, US) [dblp]
- John Hitchcock (University of Wyoming - Laramie, US)
- Günter Hotz (Universität des Saarlandes, DE) [dblp]
- Marcus Hutter (IDSIA - Manno, CH) [dblp]
- Antonin Kucera (Charles University - Prague, CZ)
- Sophie Laplante (Université Paris Sud, FR) [dblp]
- Troy Lee (CWI - Amsterdam, NL) [dblp]
- Jack H. Lutz (Iowa State University, US)
- Elvira Mayordomo (University of Zaragoza, ES) [dblp]
- Wolfgang Merkle (Universität Heidelberg, DE) [dblp]
- Nenad Mihailovic (Universität Heidelberg, DE)
- Andrej A. Muchnik (INT - Moscow, RU)
- André Otfrid Nies (University of Auckland, NZ) [dblp]
- Marc E. Posner (Ohio State University, US)
- Alexandr Rastsvetaev (Culturall - Wien, AT)
- Jan Reimann (Universität Heidelberg, DE) [dblp]
- Jorma Rissanen (HIIT - Helsinki, FI)
- Andrei Romashchenko (IITP - Moscow, RU)
- Boris Ryabko (Russian Academy of Sc. - Novosibirsk, RU)
- Jürgen Schmidhuber (IDSIA - Manno, CH) [dblp]
- Uwe Schöning (Universität Ulm, DE) [dblp]
- Rainer Schuler (Universität Ulm, DE)
- Alexei L. Semenov (INT - Moscow, RU)
- Alexander Shen (IITP - Moscow, RU) [dblp]
- Ludwig Staiger (Universität Halle-Wittenberg, DE)
- Frank Stephan (National University of Singapore, SG) [dblp]
- Sebastiaan A. Terwijn (TU Wien, AT) [dblp]
- Henry Tirri (NOKIA Research Center - Helsinki, FI) [dblp]
- Leen Torenvliet (University of Amsterdam, NL)
- John Tromp (CWI - Amsterdam, NL)
- Maxim Ushakov (Moscow State University, RU)
- Vladimir A. Uspensky (Moscow State University, RU)
- Nikolay K. Vereshchagin (Moscow State University, RU) [dblp]
- Paul M. B. Vitanyi (CWI - Amsterdam, NL)
- Vladimir Viyugin (IITP - Moscow, RU)
- Vladimir Vovk (Royal Holloway University of London, GB)
- Marius Zimand (Towson University, US)
Related Seminars
- Dagstuhl-Seminar 06051: Kolmogorov Complexity and Applications (2006-01-29 - 2006-02-03) (Details)
Keywords
- Kolmogorov complexity
- information theory
- randomness
- computational complexity