Explore kernel learning theory on Riemannian manifolds through this 48-minute conference talk examining the L_2 risk of minimum norm interpolation in reproducing kernel Hilbert spaces (RKHS). Learn about novel theoretical results for kernels defined on closed d-dimensional Riemannian manifolds, requiring only that the kernels be trace class and elliptic. Discover how the number of samples, the manifold dimension, and the kernel's properties determine a natural spectral cutoff λ(n,d,K), and understand how minimum norm interpolation learns the projection of the data-generating process onto Laplacian eigenfunctions of bounded frequency. Gain insights into sharp L_2 risk bounds that hold with high probability over the data, extending previous work on round spheres to general manifold settings. Examine joint research findings that connect statistical learning theory with differential geometry and spectral analysis on manifolds.
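To make the central object concrete, here is a minimal sketch (not taken from the talk) of minimum norm interpolation in an RKHS on the unit circle S^1, the simplest closed Riemannian manifold. The periodic kernel and the low-frequency target below are illustrative assumptions standing in for a trace-class elliptic kernel and a Laplacian eigenfunction; the interpolant is the standard kernel "ridgeless" formula f(x) = k(x, X) K⁻¹ y.

```python
import numpy as np

def kernel(t1, t2, bandwidth=0.5):
    """Illustrative periodic kernel on S^1, built from the chordal distance.
    A stand-in for a trace-class elliptic kernel; not the kernel from the talk."""
    d = 2.0 * np.sin(0.5 * (np.atleast_1d(t1)[:, None] - np.atleast_1d(t2)[None, :]))
    return np.exp(-d**2 / (2.0 * bandwidth**2))

rng = np.random.default_rng(0)
n = 20
X = rng.uniform(0.0, 2.0 * np.pi, n)   # n sample points (angles) on the circle
y = np.sin(3.0 * X)                     # target: a low-frequency Laplacian eigenfunction

# Minimum norm interpolant: coefficients alpha solve K alpha = y
# (a tiny jitter keeps the solve numerically stable).
K = kernel(X, X)
alpha = np.linalg.solve(K + 1e-10 * np.eye(n), y)

def f_hat(theta):
    """Evaluate the minimum norm interpolant at new angles theta."""
    return kernel(theta, X) @ alpha

# The interpolant fits the training data exactly (up to the jitter),
# which is the defining property of interpolation in the talk's setting.
residual = np.max(np.abs(f_hat(X) - y))
```

The talk's results concern how the L_2 risk of exactly this kind of interpolant behaves, governed by the spectral cutoff λ(n,d,K); the sketch only shows the estimator itself, not the risk analysis.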