Information Theory

Study of the quantification, storage, and communication of information

Information theory studies the fundamental limits of information processing and communication systems.
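A central quantity throughout these topics is Shannon entropy, H(X) = -Σ p(x) log₂ p(x), the average number of bits needed per symbol. As a minimal sketch (the function name and the empirical-frequency approach are illustrative choices, not from a particular text):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits of the empirical distribution of `symbols`."""
    counts = Counter(symbols)
    n = len(symbols)
    # H = -sum p(x) * log2 p(x), using observed frequencies as p(x)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin carries exactly 1 bit of information per flip.
print(shannon_entropy("HTHT"))  # 1.0
```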

Topics#

  • Shannon Entropy
  • Mutual Information
  • Divergence Properties
  • Kraft-McMillan Theorem
  • Asymptotic Equipartition
  • Fano's Inequality
  • Entropy Rates
  • Shannon Codes
  • Huffman Codes
  • Channel Coding Theorem
  • Source-Channel Separation Theorem
  • Polar Codes
  • Differential Entropy
  • Gaussian Channels
  • Elias' Code
  • Arithmetic Codes
  • Tunstall's Code
  • Channel Capacity
  • Shannon's Theorems
  • Block Linear Codes
  • Wonham Filter
  • Viterbi Algorithm
  • Rate Distortion Theory
  • Hamming Code
  • Quantization
  • Rate Distortion Function
  • Sanov's Theorem
  • Chernoff-Stein Lemma
  • Chernoff Information
  • Fisher Information
  • Cramér-Rao Inequality
  • Burg's Maximum Entropy Theorem
  • Lempel-Ziv Coding
  • Occam's Razor
  • Kolmogorov Complexity
  • Slepian-Wolf Coding
  • Kuhn-Tucker Conditions
  • Shannon-McMillan-Breiman Theorem
  • Brunn-Minkowski Inequality
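Several of the coding topics above (Shannon, Huffman, arithmetic codes) concern building prefix-free codes whose expected length approaches the entropy. As a quick taste, here is a minimal Huffman-coding sketch (the function name and dict-based representation are illustrative choices, not from a particular text):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free Huffman code; returns {symbol: bitstring}."""
    # Each heap entry: (frequency, tiebreaker, {symbol: partial codeword})
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        # Merge the two least-frequent subtrees, prefixing their codewords.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]
```

For `"aaabbc"`, the most frequent symbol `a` receives a 1-bit codeword while `b` and `c` receive 2-bit codewords, so frequent symbols cost fewer bits on average.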

Resources#

Books#

  • Elements of Information Theory by Thomas M. Cover, Joy A. Thomas
  • Information Theory, Inference, and Learning Algorithms by David J.C. MacKay
  • An Introduction to Information Theory: Symbols, Signals and Noise by John R. Pierce

Online Resources#

Practice Problems#