Concentration Functions and Entropy Bounds for Discrete Log-Concave Distributions
Hausdorff Center for Mathematics via YouTube
Overview
Explore concentration functions and entropy bounds for discrete log-concave distributions in this 44-minute lecture. Delve into two-sided bounds and their applications in deriving variants of entropy power inequalities. Learn about collaborative research conducted with Arnaud Marsiglietti and James Melbourne, focusing on Rényi entropies within the class of discrete log-concave probability distributions.
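As background for the lecture's central object: a probability sequence p_0, p_1, ... on the integers is discrete log-concave when p_k^2 >= p_{k-1} * p_{k+1} for every interior index k. A minimal sketch checking this standard definition numerically for the Poisson distribution (a classic log-concave example; the function names here are illustrative, not from the lecture):

```python
import math

def poisson_pmf(lam, k):
    # Poisson probability mass function: e^{-lam} * lam^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

def is_log_concave(p):
    # Standard definition: p_k^2 >= p_{k-1} * p_{k+1} for all interior k.
    return all(p[k]**2 >= p[k-1] * p[k+1] for k in range(1, len(p) - 1))

# For the Poisson law, p_k^2 / (p_{k-1} * p_{k+1}) = (k+1)/k >= 1,
# so the check passes for any rate parameter.
probs = [poisson_pmf(3.0, k) for k in range(20)]
print(is_log_concave(probs))  # True
```

Geometric, binomial, and Poisson distributions all belong to this class, which is what makes uniform entropy and concentration bounds over it broadly applicable.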
Syllabus
Sergey Bobkov: Concentration functions and entropy bounds for discrete log-concave distributions
Taught by
Hausdorff Center for Mathematics