

Some Novel Kernel-Based Divergences Between Probability Distributions

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

This lecture presents Anna Korba's research on "Some novel kernel-based divergences between probability distributions" at IPAM's Statistical and Numerical Methods for Non-commutative Optimal Transport Workshop at UCLA. Explore the statistical and geometrical properties of the kernel Kullback-Leibler (KKL) divergence, which compares probability distributions through the Kullback-Leibler divergence between their covariance operators in a reproducing kernel Hilbert space rather than through classical density ratios. Discover how Korba addresses a key limitation of the original KKL divergence, its inability to handle distributions with disjoint supports, by proposing a regularized variant that is well defined for all pairs of distributions. Learn about the bounds quantifying the deviation between the regularized and original KKL, finite-sample bounds, and the closed-form expressions for discrete distributions that make implementation practical. The presentation also covers a Wasserstein gradient descent scheme for the KKL divergence between discrete distributions and empirical studies of its behavior when transporting points toward a target distribution.
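
The closed-form computation for discrete distributions mentioned above can be sketched in a few lines of NumPy. The following is a minimal illustration, not Korba's exact estimator: the RBF kernel, the bandwidth sigma, the eigenvalue floor eps, the mixing regularization B_alpha = (1 - alpha) * B + alpha * A, and the helper names (rbf_kernel, psd_fun, kkl) are all assumptions made for this sketch. It represents the empirical kernel covariance operators of two sample sets on their pooled support and evaluates the von Neumann relative entropy Tr[A (log A - log B_alpha)] between them.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) Gram matrix between row-sample arrays X and Y."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))

def psd_fun(A, fun, eps=1e-12):
    """Apply a scalar function to a symmetric PSD matrix through its
    eigendecomposition, flooring eigenvalues at eps for stability."""
    w, V = eigh(A)
    return (V * fun(np.clip(w, eps, None))) @ V.T

def kkl(X, Y, sigma=1.0, alpha=0.1, eps=1e-12):
    """Illustrative KKL-style divergence between the empirical measures
    mu = unif(X) and nu = unif(Y) (a sketch, not the talk's estimator).

    On the pooled support Z = X u Y with Gram matrix K, the kernel
    covariance operators of mu and nu have the same spectra as
    A = K^{1/2} diag(p) K^{1/2} and B = K^{1/2} diag(q) K^{1/2}, so the
    divergence reduces to the von Neumann relative entropy
    Tr[A (log A - log B_alpha)], with B_alpha = (1 - alpha) B + alpha A
    a simple mixing regularization (an assumption standing in for the
    regularized variant discussed in the lecture)."""
    n, m = len(X), len(Y)
    Z = np.vstack([X, Y])
    p = np.concatenate([np.full(n, 1.0 / n), np.zeros(m)])  # weights of mu on Z
    q = np.concatenate([np.zeros(n), np.full(m, 1.0 / m)])  # weights of nu on Z
    Khalf = psd_fun(rbf_kernel(Z, Z, sigma), np.sqrt)
    A = Khalf @ np.diag(p) @ Khalf
    B = (1.0 - alpha) * (Khalf @ np.diag(q) @ Khalf) + alpha * A
    log_A = psd_fun(A, np.log, eps)
    log_B = psd_fun(B, np.log, eps)
    return float(np.trace(A @ (log_A - log_B)))

# Two well-separated Gaussian blobs: the empirical supports are disjoint,
# so the regularization (alpha > 0) is what keeps the value finite.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(50, 2))
Y = rng.normal(3.0, 1.0, size=(60, 2))
print(kkl(X, Y, sigma=1.0, alpha=0.1))
```

With alpha = 0 and disjoint empirical supports, log B is undefined on the support of A and the divergence diverges, which is exactly the limitation the regularized variant is designed to remove; any alpha > 0 keeps the value finite.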

Syllabus

Anna Korba - Some novel kernel-based divergences between probability distributions - IPAM at UCLA

Taught by

Institute for Pure & Applied Mathematics (IPAM)

