Information Theory: Defining Entropy and Information - Lecture 1
University of Oxford via YouTube
Overview
Explore fundamental concepts of Information Theory in this Oxford Mathematics third-year undergraduate lecture, which focuses on measuring the information content of random variables through entropy. Sam Cohen explains in detail how to quantify the amount of information gained from observing the outcome of a random variable, introducing entropy and related mathematical quantities along the way. The 54-minute lecture is the first in an eight-lecture series and forms part of the broader Oxford Mathematics curriculum, in which students typically follow up lectures with small-group tutorials to deepen their understanding through problem-solving and mathematical discussion.
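As a quick point of reference (a standard formulation, not a transcript of Cohen's notation), the central definition such a lecture builds toward can be written as:

```latex
% Self-information ("surprise") of an outcome x with probability p(x):
I(x) = -\log_2 p(x)

% Entropy of a discrete random variable X: the expected surprise,
% measured in bits when the logarithm is taken base 2.
H(X) = \mathbb{E}[I(X)] = -\sum_{x} p(x)\,\log_2 p(x)
```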
Syllabus
Information Theory: Defining Entropy and Information - Oxford Mathematics 3rd Year Student Lecture
Taught by
Oxford Mathematics
Reviews
5.0 rating, based on 2 Class Central reviews
- Information theory can often feel overly abstract, but Cohen grounds it immediately in the concept of "surprise." By moving from the intuitive idea that rare events are more surprising than common ones to the formal logarithmic definition of entropy, the lecture makes the transition from intuition to formal calculation feel natural rather than forced.
Unlike many "Introduction to Data Science" courses that treat entropy as a "black box" formula, Oxford’s approach is proof-oriented.
  - How entropy relates to the geometry of probability distributions via the Kullback–Leibler divergence (illustrated in the sketch after this review).
  - The chain rule of entropy, which is the "F = ma" of information science.
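Neither review quotes any code, but the quantities mentioned above are easy to demonstrate. The sketch below is a minimal illustration, assuming base-2 logarithms (so everything is in bits); the function names and the toy joint distribution are our own, not material from the lecture.

```python
import numpy as np

def surprise(p):
    """Self-information ("surprise") of an outcome with probability p, in bits."""
    return -np.log2(p)

def entropy(p):
    """Shannon entropy: the expected surprise of a distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Rare events are more surprising than common ones:
print(surprise(0.5))    # 1.0 bit (a fair coin flip)
print(surprise(0.01))   # ~6.64 bits (a 1-in-100 event)

# KL divergence measures how far one distribution is from another:
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.737 bits

# Chain rule H(X, Y) = H(X) + H(Y | X), checked on a toy joint distribution
# (rows index outcomes of X, columns index outcomes of Y):
joint = np.array([[0.3, 0.2],
                  [0.1, 0.4]])
p_x = joint.sum(axis=1)
h_y_given_x = sum(px * entropy(row / px) for px, row in zip(p_x, joint))
assert np.isclose(entropy(joint.ravel()), entropy(p_x) + h_y_given_x)
```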
- Excellent. I understood the basics of information theory and coding, including coding techniques and entropy calculations.