Introduction to Information Theory - Lecture 1
International Centre for Theoretical Sciences via YouTube
Overview
Explore the fundamental concepts of information theory in this lecture, delivered as part of the Bangalore School on Statistical Physics-XVI at the International Centre for Theoretical Sciences. Learn the mathematical foundations and core principles that govern the quantification, storage, and communication of information, and discover how information theory bridges statistical physics with communication systems, data compression, and computational complexity.

Examine key concepts including entropy, mutual information, and channel capacity through rigorous mathematical treatment suitable for advanced graduate students and researchers. Understand the historical development of information theory from Claude Shannon's groundbreaking work to modern applications in statistical physics and complex systems, and gain insights into how information-theoretic measures provide powerful tools for analyzing physical systems, phase transitions, and emergent phenomena.

This pedagogical presentation forms part of a comprehensive advanced-level school designed to bridge master's-level coursework with cutting-edge research topics in statistical physics, making it particularly valuable for Ph.D. students, postdoctoral fellows, and faculty members seeking to incorporate information-theoretic approaches into their research.
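The entropy and mutual information mentioned above can be illustrated with a minimal sketch in Python (the function names are illustrative, not taken from the lecture):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.
    Terms with p(x) = 0 contribute nothing, by convention."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint
    distribution given as a 2-D list of probabilities."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    hxy = entropy([p for row in joint for p in row])  # joint entropy
    return entropy(px) + entropy(py) - hxy

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0

# Two independent fair coins share zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

For dependent variables the mutual information becomes positive, reaching H(X) when Y determines X completely; channel capacity is then the maximum of I(X;Y) over input distributions.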
Syllabus
Introduction to Information Theory (Lecture 1) by Jaikumar Radhakrishnan
Taught by
International Centre for Theoretical Sciences