YouTube

Kernel Learning on Manifolds

Harvard CMSA via YouTube

Overview

Explore kernel learning theory on Riemannian manifolds through this 48-minute conference talk examining the L_2 risk of minimum norm interpolation in reproducing kernel Hilbert spaces (RKHS). Learn about novel theoretical results for kernels defined on closed d-dimensional Riemannian manifolds, requiring only that kernels be trace class and elliptic. Discover how the number of samples, manifold dimension, and kernel properties determine a natural spectral cutoff λ(n,d,K), and understand how minimal norm interpolation learns the projection of data generating processes onto Laplacian eigenfunctions with bounded frequency. Gain insights into sharp L_2 risk bounds with high probability over data, extending previous work on round spheres to general manifold settings. Examine joint research findings that connect statistical learning theory with differential geometry and spectral analysis on manifolds.
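To make the central object of the talk concrete, here is a minimal sketch of minimum norm interpolation in an RKHS. This is a generic illustration, not the talk's construction: it uses a Gaussian (RBF) kernel on Euclidean data rather than a trace class, elliptic kernel on a closed Riemannian manifold, and all function and parameter names are chosen for illustration.

```python
import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    """Gaussian kernel matrix between row-vector sets A and B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def min_norm_interpolant(X, y, bandwidth=1.0):
    """Return f, the minimum RKHS-norm function with f(X) = y.

    The interpolant is f(z) = k(z, X) @ K(X, X)^{-1} y, the unique
    element of the RKHS of smallest norm that fits the data exactly.
    """
    K = rbf_kernel(X, X, bandwidth)
    alpha = np.linalg.solve(K, y)  # coefficients of the kernel expansion
    return lambda Z: rbf_kernel(Z, X, bandwidth) @ alpha

# Toy example: interpolate noiseless samples of sin on [0, pi].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, np.pi, size=(10, 1))
y = np.sin(X).ravel()
f = min_norm_interpolant(X, y, bandwidth=0.5)
print(np.max(np.abs(f(X) - y)))  # near zero: exact fit at the sample points
```

In the manifold setting of the talk, the analogous interpolant is analyzed in terms of the Laplacian eigenbasis: below the spectral cutoff λ(n,d,K) it recovers the low-frequency projection of the target, while higher frequencies are not learned.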

Syllabus

Boris Hanin | Kernel Learning on Manifolds

Taught by

Harvard CMSA

