
MIT OpenCourseWare

Data Compression and Shannon's Noiseless Coding Theorem - Lecture 16

MIT OpenCourseWare via YouTube

Overview

Explore the mathematical foundations of data compression through this lecture from MIT's Principles of Discrete Applied Mathematics course. Begin with the historical development of data compression techniques before diving into the formal definition of first-order sources and the mathematical framework for compression analysis. Learn how entropy serves as a fundamental measure in information theory and discover its crucial role in determining compression limits. Study Shannon's groundbreaking noiseless coding theorem, including its complete mathematical proof, which establishes the theoretical optimal compression ratio achievable for any first-order source. Gain insight into how this theorem provides the mathematical foundation for modern compression algorithms and understand the fundamental limits of lossless data compression from an information-theoretic perspective.
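The relationship the lecture builds toward can be illustrated with a small sketch (not taken from the lecture itself): for a first-order source with a hypothetical four-symbol probability distribution, the entropy H gives the bits-per-symbol limit, and an optimal prefix code (Huffman's algorithm is used here as a concrete stand-in) achieves an expected length L with H ≤ L < H + 1, as Shannon's noiseless coding theorem guarantees.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p): the compression limit in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal prefix code, via Huffman's algorithm."""
    # Heap items: (probability, unique tiebreaker id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge adds one bit to every symbol below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

# Hypothetical first-order source (dyadic probabilities, so the bound is tight).
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)  # H <= L < H + 1, per the noiseless coding theorem
```

With dyadic probabilities the code meets the entropy bound exactly (H = L = 1.75 bits/symbol); for general distributions the expected length falls strictly within one bit of the entropy.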

Syllabus

Lecture 16: Data Compression and Shannon’s Noiseless Coding Theorem

Taught by

MIT OpenCourseWare

