Stanford EE274: Data Compression - Beyond IID Distributions: Conditional Entropy - Lecture 8
Stanford University via YouTube
Overview
Explore the concept of conditional entropy and its role in compressing data beyond independent and identically distributed (IID) sources in this lecture from Stanford University's EE274: Data Compression I course. The material is presented by Professor Tsachy Weissman, with contributions from Shubham Chandak and Pulkit Tandon, and shows how conditioning on past symbols extends entropy-based compression techniques to sources with memory. Supplementary materials are available on the course website, and enrollment information for the full online course can be found through Stanford's online learning platform.
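To illustrate the core idea of the lecture, here is a small hedged sketch (not taken from the course materials): for a hypothetical binary first-order Markov source, the conditional entropy H(X_n | X_{n-1}) is lower than the marginal entropy H(X_n), which is exactly the gap a compressor exploits by conditioning on the previous symbol instead of modeling the source as IID.

```python
import math

# Toy example, not from the lecture: a symmetric binary Markov chain
# where P(next symbol == previous symbol) = 0.9. By symmetry, the
# stationary distribution is uniform, so the marginal entropy is
# H(X) = 1 bit/symbol.
p_stay = 0.9
H_marginal = 1.0

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# For this symmetric chain, H(X_n | X_{n-1}) = h2(p_stay):
# given the previous symbol, the next one is a biased coin flip.
H_conditional = h2(p_stay)

print(f"H(X)      = {H_marginal:.3f} bits/symbol")
print(f"H(X|prev) = {H_conditional:.3f} bits/symbol")
# Conditioning never increases entropy; here it strictly reduces it,
# so a conditional model can compress well below 1 bit/symbol.
```

Here H(X|prev) comes out to about 0.469 bits/symbol, so a compressor that models the dependence on the previous symbol can use less than half the bits of an IID model for this source.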
Syllabus
Stanford EE274: Data Compression I 2023 I Lecture 8 - Beyond IID distributions: Conditional entropy
Taught by
Stanford Online