Information Theory and Its Applications in Combinatorics and Computation
International Centre for Theoretical Sciences via YouTube
Overview
Explore the fundamental principles of information theory and discover how these concepts apply to combinatorics and computational problems in this lecture from the Bangalore School on Statistical Physics-XVI. Learn about entropy, mutual information, and channel capacity while examining their powerful applications in solving combinatorial problems and understanding computational complexity. Delve into the mathematical foundations that connect information-theoretic measures to counting problems, graph theory, and algorithmic analysis. Understand how information theory provides elegant tools for proving lower bounds in computational complexity and for analyzing the efficiency of algorithms. Examine specific examples where information-theoretic arguments lead to breakthrough results in combinatorics, including applications to coding theory, communication complexity, and probabilistic methods. Gain insights into how these theoretical frameworks bridge pure mathematics and practical computation, making this lecture valuable for Ph.D. students, postdoctoral fellows, and faculty members working at the intersection of statistical physics, mathematics, and computer science.
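One concrete example of the kind of result the lecture covers is the entropy bound on binomial sums: for k ≤ n/2, the number of binary strings of length n with at most k ones is at most 2^{n·H(k/n)}, where H is the binary entropy function. This classic bound is often proved with an information-theoretic argument. The sketch below (illustrative only, not taken from the lecture) checks it numerically for one choice of n and k:

```python
from math import comb, log2

def binary_entropy(p):
    """Binary entropy H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Number of n-bit strings with at most k ones, for k <= n/2.
n, k = 20, 5
count = sum(comb(n, i) for i in range(k + 1))

# The entropy bound: count <= 2^(n * H(k/n)).
bound = 2 ** (n * binary_entropy(k / n))
print(count, bound)  # the count is dominated by the entropy bound
assert count <= bound
```

The same entropy function reappears throughout the lecture's themes, from channel capacity to counting arguments in combinatorics.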
Syllabus
Information Theory and its applications in Combinatorics and Computation by Jaikumar Radhakrishnan.
Taught by
International Centre for Theoretical Sciences