Overview
Explore kernel learning theory on Riemannian manifolds through this 48-minute conference talk examining the L_2 risk of minimum norm interpolation in reproducing kernel Hilbert spaces (RKHS). Learn about novel theoretical results for kernels defined on closed d-dimensional Riemannian manifolds, requiring only that kernels be trace class and elliptic. Discover how the number of samples, manifold dimension, and kernel properties determine a natural spectral cutoff λ(n,d,K), and understand how minimal norm interpolation learns the projection of data generating processes onto Laplacian eigenfunctions with bounded frequency. Gain insights into sharp L_2 risk bounds with high probability over data, extending previous work on round spheres to general manifold settings. Examine joint research findings that connect statistical learning theory with differential geometry and spectral analysis on manifolds.
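The central object of the talk can be sketched as follows (the notation here is a standard formulation assumed for illustration, not taken from the listing). Given samples (x_i, y_i) on a closed d-dimensional Riemannian manifold M and a trace-class elliptic kernel K with RKHS H_K, the minimum norm interpolant is

```latex
\hat{f} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}_K} \|f\|_{\mathcal{H}_K}
\quad \text{subject to} \quad f(x_i) = y_i, \quad i = 1, \dots, n.
```

The result described above says that, with high probability over the data, the L_2 risk of this interpolant behaves as if the data-generating function were projected onto the span of Laplace–Beltrami eigenfunctions on M with frequency below the spectral cutoff λ(n, d, K).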
Syllabus
Boris Hanin | Kernel Learning on Manifolds
Taught by
Harvard CMSA