Module From: Learning Hub

Entropy & Data Compression: The Foundations of Information

Overview

Learn about the quantitative treatment of information first proposed in Claude Shannon’s groundbreaking work: an introduction to the concepts of entropy, data compression, and channels.

Week 1: The Building Blocks of Information
The first week of the entropy module introduces the bit as the basic unit of information and the key concept of entropy, a value that captures the amount of uncertainty inherent in a random variable. The class then turns to the problem of encoding messages and two corresponding results: Kraft’s Inequality and Shannon’s First Theorem. Finally, the course bounds the efficiency of encoding schemes in terms of entropy and works through the solutions to a few example problems that illustrate the concepts presented in the class.
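The entropy and Kraft-sum calculations described above can be sketched in Python (a minimal illustration, not course code; the function names are my own):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft sum for binary codeword lengths; a value <= 1 means
    a prefix-free code with those lengths exists."""
    return sum(2 ** -l for l in lengths)

# A fair coin carries exactly one bit of uncertainty; a biased coin less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# Lengths {1, 2, 2} give a Kraft sum of exactly 1.0, so a binary
# prefix code with those lengths exists (e.g. 0, 10, 11).
print(kraft_sum([1, 2, 2]))  # 1.0
```

Shannon’s First Theorem says the entropy is a lower bound on the average codeword length of any uniquely decodable code, which is why both quantities appear together here.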

Week 2: Data Compression
The second week of the entropy module concludes the theoretical discussion of encoding messages and introduces frequently used methods of data compression. The lectures introduce Huffman Coding and work through several examples while describing the advantages and limits of the scheme. The videos briefly mention Universal Coding as an application of Shannon’s First Theorem before introducing the Lempel-Ziv compression scheme. Upon completion of this week, students should be able to efficiently encode messages based on the probability distribution of each message and understand the tradeoff between efficiency and code length.

Week 3: Channels
The third week of the entropy module completes the material covered in the corresponding problem sets with a study of advanced entropy topics. The lectures note that any means of communication carries some probability of error and introduce a simple model for handling error probabilities, the binary symmetric channel. The videos work through example problems with binary symmetric channels before describing how asymmetric channels differ. Finally, the module addresses cases where certain information is known a priori through conditional entropy, and covers the related topics of joint entropy and mutual information for completeness.
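The standard formula tying these ideas together for the binary symmetric channel can be sketched as follows (a textbook identity, not necessarily the module's notation): with crossover probability p and a uniform input, the mutual information I(X;Y) = H(Y) - H(Y|X) reduces to 1 - H(p).

```python
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    i.e. the mutual information I(X;Y) achieved by a uniform input."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0 -- noiseless: one full bit per use
print(bsc_capacity(0.5))   # 0.0 -- output independent of input
print(bsc_capacity(0.11))  # ~0.5
```

The two extremes mirror the intuition above: with no errors the channel carries one full bit per use, while at p = 0.5 the conditional entropy H(Y|X) equals H(Y) and no information gets through.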