Increasing the gap between descriptional complexity and algorithmic probability
- by Adam R. Day
- Trans. Amer. Math. Soc. 363 (2011), 5577-5604
- DOI: https://doi.org/10.1090/S0002-9947-2011-05315-8
- Published electronically: May 5, 2011
Abstract:
The coding theorem is a fundamental result of algorithmic information theory. A well-known theorem of Gács shows that the analog of the coding theorem fails for continuous sample spaces; that is, descriptional monotonic complexity does not coincide within an additive constant with the negative logarithm of algorithmic probability. Gács's proof provided a lower bound on the difference between these values: he showed that for infinitely many finite binary strings, this difference exceeds a version of the inverse Ackermann function applied to string length. This paper establishes that this lower bound can be substantially improved: the inverse Ackermann function can be replaced with a function of order $O(\log(\log(x)))$. This shows that in continuous sample spaces, descriptional monotonic complexity and algorithmic probability are very different. While the proof builds on the original work of Gács, it has a number of new features; in particular, the algorithm at the heart of the proof works on sets of strings rather than on individual strings.
References
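In standard notation (an informal restatement for orientation, not verbatim from the paper), the contrast between the discrete and continuous settings can be written as follows, where $K$ is prefix-free complexity, $\mathbf{m}$ the universal discrete semimeasure, $Km$ monotone complexity, and $\mathbf{M}$ the universal continuous semimeasure:

```latex
% Discrete sample spaces (Levin's coding theorem): complexity and the
% negative logarithm of algorithmic probability agree up to a constant.
\[
  K(x) = -\log \mathbf{m}(x) + O(1).
\]
% Continuous sample spaces: Gács showed the analogous identity fails,
% i.e. the gap  Km(x) - (-\log \mathbf{M}(x))  is unbounded. Informally,
% this paper strengthens his inverse-Ackermann lower bound to one of
% order log log in the string length: for some constant c > 0,
\[
  Km(x) + \log \mathbf{M}(x) \;>\; c \,\log \log |x|
  \quad \text{for infinitely many finite binary strings } x.
\]
```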
- William C. Calhoun, Degrees of monotone complexity, J. Symbolic Logic 71 (2006), no. 4, 1327–1341. MR 2275862, DOI 10.2178/jsl/1164060458
- G. J. Chaitin, Incompleteness theorems for random reals, Adv. in Appl. Math. 8 (1987), no. 2, 119–146. MR 886921, DOI 10.1016/0196-8858(87)90010-8
- R. G. Downey and D. R. Hirschfeldt, Algorithmic randomness and complexity, Springer-Verlag, to appear.
- Péter Gács, On the relation between descriptional complexity and algorithmic probability, Theoret. Comput. Sci. 22 (1983), no. 1-2, 71–93. MR 693050, DOI 10.1016/0304-3975(83)90139-1
- L. A. Levin, The concept of a random sequence, Dokl. Akad. Nauk SSSR 212 (1973), 548–550 (Russian). MR 0366096
- A. K. Zvonkin and L. A. Levin, The complexity of finite objects and the basing of the concepts of information and randomness on the theory of algorithms, Uspehi Mat. Nauk 25 (1970), no. 6(156), 85–127 (Russian). MR 0307889
- Ming Li and Paul Vitányi, An introduction to Kolmogorov complexity and its applications, 2nd ed., Graduate Texts in Computer Science, Springer-Verlag, New York, 1997. MR 1438307, DOI 10.1007/978-1-4757-2606-0
- André Nies, Computability and randomness, Oxford Logic Guides, vol. 51, Oxford University Press, Oxford, 2009. MR 2548883, DOI 10.1093/acprof:oso/9780199230761.001.0001
- C.-P. Schnorr, Process complexity and effective random tests, J. Comput. System Sci. 7 (1973), 376–388. MR 325366, DOI 10.1016/S0022-0000(73)80030-3
- C.-P. Schnorr, A survey of the theory of random sequences, Basic problems in methodology and linguistics (Proc. Fifth Internat. Congr. Logic, Methodology and Philos. of Sci., Part III, Univ. Western Ontario, London, Ont., 1975) Univ. Western Ontario Ser. Philos. Sci., Vol. 11, Reidel, Dordrecht, 1977, pp. 193–211. MR 0517133
- V. A. Uspensky and A. Shen, Relations between varieties of Kolmogorov complexities, Math. Systems Theory 29 (1996), no. 3, 271–292. MR 1374498, DOI 10.1007/BF01201280
Bibliographic Information
- Adam R. Day
- Affiliation: School of Mathematics, Statistics and Operations Research, Victoria University of Wellington, Wellington 6140, New Zealand
- Email: adam.day@msor.vuw.ac.nz
- Received by editor(s): October 21, 2009
- Received by editor(s) in revised form: February 10, 2010
- Published electronically: May 5, 2011
- © Copyright 2011 American Mathematical Society
- The copyright for this article reverts to the public domain 28 years after publication.
- Journal: Trans. Amer. Math. Soc. 363 (2011), 5577-5604
- MSC (2000): Primary 68Q30
- DOI: https://doi.org/10.1090/S0002-9947-2011-05315-8
- MathSciNet review: 2813425