Introductory Information Theory

Overview


This short course introduces Shannon's methods for measuring the capacity and reliability of communication channels, covering coding, entropy, and channels.

Recommended for those who have had some exposure to probability theory and statistics.

Estimated time required: 2 hours per week.


Syllabus/Suggested Schedule



Week 1: Coding, Entropy and Inequalities

  • Introductory Information Theory - Overview
  • Introductory Information Theory Part 1 - Coding
  • Introductory Information Theory Part 1 - Entropy with Variance
  • Introductory Information Theory Part 1 - Huffman Coding
  • Lempel Ziv 77 Part 1
  • Lempel Ziv 77 Part 2 - C++ Program
  • Introductory Information Theory Part 1 - Shannon's 1st Theorem
  • Introductory Information Theory Part 1 - Kraft Inequality
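The Week 1 topics above can be previewed with a short sketch: a Huffman code built with Python's `heapq`, compared against the source entropy. The symbol probabilities below are invented for illustration, chosen to be dyadic so the average codeword length meets the entropy bound of Shannon's first theorem exactly:

```python
import heapq
from math import log2

def huffman_codes(probs):
    """Build a Huffman code for a {symbol: probability} map."""
    # Heap entries are (probability, unique tiebreak, {symbol: codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees, prefixing their
        # codewords with 0 and 1 respectively.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Illustrative dyadic source (probabilities are made up for the example).
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman_codes(probs)
avg_len = sum(p * len(codes[s]) for s, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
```

Because a Huffman code is a complete prefix code, its codeword lengths satisfy the Kraft inequality with equality, sum over i of 2^(-l_i) = 1, and for this dyadic source the average length equals the entropy (1.75 bits/symbol).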

Week 2: Channels & Information

  • Introductory Information Theory Part 2 - Channels
  • Introductory Information Theory Part 2 - Conditional and Joint Entropy Part 1
  • Introductory Information Theory Part 2 - Conditional and Joint Entropy Part 2
  • Introductory Information Theory Part 2 - Mutual Information
  • Introductory Information Theory Part 2 - Information Theory Textbooks
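As a preview of the Week 2 material, mutual information can be computed directly from a joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below uses an illustrative binary symmetric channel with crossover probability 0.1 and a uniform input; all numbers are invented for the example:

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Joint distribution of (X, Y) for a binary symmetric channel with
# crossover probability eps = 0.1 and uniform input (example values).
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}

# Marginal distributions of X and Y.
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(px) + entropy(py) - entropy(joint.values())
```

For a binary symmetric channel with uniform input this reduces to 1 - H_b(eps), where H_b is the binary entropy function; that value (about 0.531 bits here) is the channel's capacity.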

Copyright © Purdue University, all rights reserved. Purdue University is an equal access/equal opportunity university.

Contact the College of Science at sciencehelp@purdue.edu if you have trouble accessing this page. Made possible by NSF grant CCF-0939370.