Information Theory and Its Applications in Combinatorics and Computation
International Centre for Theoretical Sciences via YouTube
Overview
Explore the fundamental principles of information theory and discover how these concepts apply to combinatorics and computational problems in this lecture from the Bangalore School on Statistical Physics-XVI. Learn about entropy, mutual information, and channel capacity while examining their powerful applications in solving combinatorial problems and understanding computational complexity. Delve into the mathematical foundations that connect information-theoretic measures to counting problems, graph theory, and algorithmic analysis. Understand how information theory provides elegant tools for proving lower bounds in computational complexity and for analyzing the efficiency of algorithms. Examine specific examples where information-theoretic arguments lead to breakthrough results in combinatorics, including applications to coding theory, communication complexity, and probabilistic methods. Gain insights into how these theoretical frameworks bridge pure mathematics and practical computation, making this lecture valuable for Ph.D. students, postdoctoral fellows, and faculty members working at the intersection of statistical physics, mathematics, and computer science.
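One of the counting applications the overview alludes to can be sketched concretely: the binary entropy function upper-bounds binomial coefficients via log2 C(n, k) ≤ n·H(k/n), a standard information-theoretic argument in combinatorics. The minimal Python sketch below illustrates this; the specific values n = 20, k = 7 are illustrative choices, not taken from the lecture.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Classic counting bound: for 0 <= k <= n,
#   log2 C(n, k) <= n * H(k/n),
# i.e., the entropy of the empirical bit frequency bounds the
# (log of the) number of n-bit strings of Hamming weight k.
n, k = 20, 7  # illustrative parameters
log_binom = math.log2(math.comb(n, k))
bound = n * binary_entropy(k / n)
print(f"log2 C({n},{k}) = {log_binom:.3f} <= n*H(k/n) = {bound:.3f}")
```

The bound follows by noting that a Binomial(n, k/n) distribution assigns each weight-k string probability 2^(-n·H(k/n)), and these probabilities sum to at most 1.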
Syllabus
Information Theory and its applications in Combinatorics and Computation by Jaikumar Radhakrishnan.
Taught by
International Centre for Theoretical Sciences