
YouTube

To Intrinsic Dimension and Beyond - Efficient Sampling in Diffusion Models

Harvard CMSA via YouTube

Overview

Explore the theoretical foundations of diffusion models in this 47-minute conference talk from Harvard CMSA's Workshop on Mathematical Foundations of AI. Delve into the mathematical analysis of denoising diffusion probabilistic models (DDPM) and discover how these cornerstone generative AI models achieve practical efficiency despite conservative worst-case theoretical bounds. Learn about research demonstrating that DDPM iteration complexity scales nearly linearly with the intrinsic dimension k for broad classes of data distributions, with optimal guarantees under the KL divergence metric. Examine the case of Gaussian mixture distributions, where DDPM's iteration complexity scales only logarithmically in the number of mixture components, offering theoretical justification for the efficiency observed in practice. Gain insight into how diffusion models automatically exploit the intrinsic low-dimensionality of data to achieve significant sampling speed-ups, bridging the gap between theory and practice in generative modeling.
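
For context, the iteration complexity discussed in the talk counts the steps of DDPM's reverse (ancestral) sampling loop. Below is a minimal NumPy sketch of that standard loop, following Ho et al. (2020); the names (ddpm_sample, eps_model, betas) are illustrative placeholders, not code from the talk.

```python
import numpy as np

def ddpm_sample(eps_model, betas, shape, rng=None):
    """Standard DDPM ancestral sampling loop (Ho et al., 2020).

    eps_model(x, t): trained network predicting the noise added at step t
    betas: forward-process variance schedule beta_1, ..., beta_T
    """
    rng = rng or np.random.default_rng()
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    x = rng.standard_normal(shape)  # x_T: start from pure Gaussian noise
    for t in range(len(betas) - 1, -1, -1):
        eps = eps_model(x, t)  # predicted noise at step t
        # Posterior mean of x_{t-1} given x_t and the noise prediction
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        # Inject fresh Gaussian noise at every step except the last (t = 0)
        x = mean + (np.sqrt(betas[t]) * rng.standard_normal(shape) if t > 0 else 0.0)
    return x
```

The result described above concerns how large the number of loop iterations T must be for an accurate sample (measured in KL divergence): for data with intrinsic dimension k, T grows roughly linearly in k rather than in the much larger ambient dimension, which is the source of the sampling speed-ups mentioned.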

Syllabus

Yuting Wei | To Intrinsic Dimension and Beyond: Efficient Sampling in Diffusion Models

Taught by

Harvard CMSA
