Digital Revolution (II) - Compression Codes and Technologies
2. Information and compression
The roots of compression lie in the work of the mathematician Claude Shannon.
In his now-famous 1948 paper "A Mathematical Theory of Communication," Shannon investigated the mathematics of how information is sent from one location to another, as well as how information is transformed from one format to another. For example, to convey the music of an orchestra to your ear, the sounds must be converted to electronic signals (using microphones) and stored on a tape, long-playing record, or CD before being transformed back into something you can listen to. Shannon's work used the concept of a communications channel as a mathematical model of how information gets from one location to another. He showed that codes could be designed to correct errors made in sending information over a communications channel: such error-correcting codes had to exist, and Richard Hamming later produced explicit examples of them.
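To make the idea of an error-correcting code concrete, here is a small sketch of the kind of code Hamming constructed: the [7,4] Hamming code, which adds three parity bits to four data bits so that any single flipped bit can be located and corrected. (This is an illustration of the general idea, not Shannon's or Hamming's original presentation.)

```python
# Sketch of the [7,4] Hamming code: 4 data bits become a 7-bit codeword
# that survives any single-bit transmission error.

def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Conventional bit positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based error position, 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

The three parity checks are arranged so that their failures, read as a binary number, spell out the position of the corrupted bit.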
Shannon showed that it was also possible to compress information, and he produced examples of such codes, now known as Shannon-Fano codes. Robert Fano was an electrical engineer at MIT, and the son of the Italian mathematician Gino Fano, who pioneered the development of finite geometries and for whom the Fano plane is named.
The Shannon-Fano codes were soon to be overshadowed by the work of David Huffman, a student of Fano's at MIT.
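Huffman's idea can be sketched briefly: build the code bottom-up by repeatedly merging the two least frequent symbols, so that common symbols end up with short codewords and rare ones with long codewords. The snippet below is a minimal illustration (the sample text is made up; this is not Huffman's original notation).

```python
# Sketch of Huffman's construction: repeatedly merge the two least
# frequent symbols into one node until a single tree remains.
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Return a prefix-free binary code for a {symbol: frequency} map."""
    # Heap entries: (frequency, tiebreak, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # least frequent
        f2, _, c2 = heapq.heappop(heap)   # second least frequent
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(Counter(text))
encoded = "".join(codes[ch] for ch in text)
```

Because no codeword is a prefix of another, the encoded bit stream can be decoded unambiguously without separators; the frequent letter "a" receives a shorter codeword than the rare "c".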