Hindustan Book Agency, 2007; 168 pp; hardcover
ISBN-10: 8185931755; ISBN-13: 9788185931753
List Price: US$34; Member Price: US$27.20; Order Code: HIN/33
The aim of this little book is to convey three principal developments in the evolution of modern information theory: Shannon's initiation of a revolution in 1948 through his interpretation of Boltzmann entropy as a measure of the information yielded by an elementary statistical experiment, together with his basic coding theorems on storing messages and transmitting them through noisy communication channels in an optimal manner; the influence of ergodic theory in enlarging the scope of Shannon's theorems through the works of McMillan, Feinstein, Wolfowitz, Breiman and others, and its impact on the appearance of the Kolmogorov-Sinai invariant for elementary dynamical systems; and finally, the more recent work of Schumacher, Holevo, Winter and others on the role of von Neumann entropy in the quantum avatar of the basic coding theorems, in which messages are encoded as quantum states, transmitted through noisy quantum channels, and retrieved by generalized measurements.

A publication of Hindustan Book Agency; distributed within the Americas by the American Mathematical Society. Maximum discount of 20% for all commercial channels.

Readership: Graduate students and research mathematicians interested in applications.

Table of Contents

Entropy of elementary information sources
 Stationary information sources
 Communication in the presence of noise
 Quantum coding theorems
 Bibliography
 Index
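As a small illustration of the book's starting point, the Shannon entropy of an elementary source with probability vector p is H(p) = -Σ p_i log₂ p_i, the average number of bits of information yielded per outcome. The sketch below (the function name and interface are our own, not from the book) computes this quantity:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) of a probability vector."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing (0 * log 0 := 0 by convention).
    return -sum(x * math.log(x, base) for x in p if x > 0)

print(shannon_entropy([0.5, 0.5]))    # a fair coin yields exactly 1.0 bit
print(shannon_entropy([0.9, 0.1]))    # a biased coin yields less (about 0.469 bits)
print(shannon_entropy([0.25] * 4))    # a uniform 4-symbol source yields 2.0 bits
```

A deterministic source (p = [1.0]) has entropy 0: it conveys no information, which is the sense in which H measures the information yielded by a statistical experiment.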
