Abstract.
We study the rate at which entropy is produced by linear combinations of independent random variables which satisfy a spectral gap condition.
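For context, a hedged sketch of the setting (the notation below is assumed for illustration and is not taken from the abstract): for i.i.d. copies $X_1, X_2, \ldots$ of a mean-zero, variance-one random variable satisfying a Poincaré (spectral gap) inequality, one considers the relative entropy of the normalized sums with respect to the standard Gaussian.

```latex
% Assumed setting: X_1, X_2, \ldots i.i.d., \mathbb{E}X = 0, \mathbb{E}X^2 = 1,
% with spectral gap c > 0, i.e. c\,\mathrm{Var}(g(X)) \le \mathbb{E}[g'(X)^2]
% for all smooth g.
S_n = \frac{X_1 + \cdots + X_n}{\sqrt{n}}, \qquad
D(S_n) = \int f_{S_n}(x) \log \frac{f_{S_n}(x)}{\varphi(x)}\, dx,
```

where $f_{S_n}$ denotes the density of $S_n$ and $\varphi$ the standard Gaussian density; under such a spectral gap condition the relative entropy $D(S_n)$ decays at rate $O(1/n)$.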
Mathematics Subject Classification (2000): 94A17; 60F05
Supported in part by the EU Grant HPMT-CT-2000-00037, The Minkowski center for Geometry and the Israel Science Foundation.
Supported in part by NSF Grant DMS-9796221.
Supported in part by EPSRC Grant GR/R37210.
Supported in part by the BSF, Clore Foundation and EU Grant HPMT-CT-2000-00037.
Artstein, S., Ball, K., Barthe, F. et al. On the rate of convergence in the entropic central limit theorem. Probab. Theory Relat. Fields 129, 381–390 (2004). https://doi.org/10.1007/s00440-003-0329-4