Minimum Entropy of a Log-Concave Random Variable with Fixed Variance
Hausdorff Center for Mathematics via YouTube
Overview
Explore a mathematical lecture on the minimum entropy of log-concave random variables with fixed variance, showing that the exponential random variable achieves the minimum Shannon differential entropy in this class. Learn how the result yields upper bounds on the capacity of additive noise channels with log-concave noise, and how it gives improved constants in reverse entropy power inequalities for log-concave random variables. Drawing on collaborative research, the talk offers insight into advanced probability theory and its implications for information theory and channel capacity analysis.
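The headline result can be sanity-checked with closed-form entropies: an exponential distribution with variance σ² has differential entropy 1 + ln σ, which is strictly below the Gaussian value ½·ln(2πeσ²) ≈ ln σ + 1.419 (the Gaussian being the maximizer over all distributions with that variance). A minimal sketch in Python, using standard formulas; the function names are illustrative, not from the lecture:

```python
import math

def exp_entropy(sigma):
    """Differential entropy of Exp(rate=1/sigma), which has variance sigma^2.

    For Exp(lam), h = 1 - ln(lam); here lam = 1/sigma, so h = 1 + ln(sigma).
    """
    return 1.0 + math.log(sigma)

def gauss_entropy(sigma):
    """Differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

if __name__ == "__main__":
    sigma = 1.0
    # Exponential (a log-concave law) sits strictly below the Gaussian maximum.
    print(exp_entropy(sigma))    # 1.0
    print(gauss_entropy(sigma))  # ~1.4189
```

The gap gauss_entropy(σ) − exp_entropy(σ) = ½·ln(2πe) − 1 is a constant independent of σ, which is the kind of variance-free comparison that feeds into the reverse entropy power inequality constants mentioned above.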
Syllabus
Piotr Nayar: Minimum entropy of a log-concave random variable with fixed variance
Taught by
Hausdorff Center for Mathematics