Parameterization Invariant Representations for Efficient Shape Learning
Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube
Overview
Explore a 32-minute conference talk from the Thematic Programme on "Infinite-dimensional Geometry: Theory and Applications" at the Erwin Schrödinger International Institute, where a data-driven framework for parameterization invariant representations of 3D graphs and meshes is presented. Discover how the gradient of the varifold norm enables representations that remain invariant to parameterization while maintaining robustness against sampling noise. Learn about the framework's ability to maintain fixed dimensions regardless of input vertex count, making it compatible with conventional neural network architectures for tasks like classification and registration of raw scan data.
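To make the parameterization-invariance idea concrete, here is a minimal illustrative sketch (not the talk's actual framework, which operates on 3D meshes and uses the gradient of the varifold norm). It computes a discrete varifold distance between closed 2D curves, using an assumed Gaussian position kernel and a squared tangent kernel: two different samplings of the same circle are nearly indistinguishable, while a circle and an ellipse are clearly separated, regardless of how many vertices each discretization uses.

```python
import numpy as np

def curve_varifold(pts):
    """Closed polyline -> (edge midpoints, unit tangents, edge lengths)."""
    nxt = np.roll(pts, -1, axis=0)
    edges = nxt - pts
    lens = np.linalg.norm(edges, axis=1)
    return 0.5 * (pts + nxt), edges / lens[:, None], lens

def varifold_inner(a, b, sigma=0.5):
    """Discrete varifold inner product with Gaussian position kernel
    and orientation-invariant squared tangent kernel (assumed choices)."""
    xa, ta, la = a
    xb, tb, lb = b
    d2 = ((xa[:, None, :] - xb[None, :, :]) ** 2).sum(-1)
    kpos = np.exp(-d2 / sigma**2)
    knor = (ta @ tb.T) ** 2
    return (kpos * knor * la[:, None] * lb[None, :]).sum()

def varifold_dist2(pa, pb, sigma=0.5):
    """Squared varifold distance between two closed polylines."""
    a, b = curve_varifold(pa), curve_varifold(pb)
    return (varifold_inner(a, a, sigma)
            - 2 * varifold_inner(a, b, sigma)
            + varifold_inner(b, b, sigma))

def sample_ellipse(n, rx=1.0, ry=1.0, phase=0.0):
    t = np.linspace(0, 2 * np.pi, n, endpoint=False) + phase
    return np.stack([rx * np.cos(t), ry * np.sin(t)], axis=1)

# Same shape, different parameterizations (vertex count and phase):
d_same = varifold_dist2(sample_ellipse(100), sample_ellipse(137, phase=0.3))
# Genuinely different shapes:
d_diff = varifold_dist2(sample_ellipse(100), sample_ellipse(100, ry=0.5))
```

Note that `d_same` stays near zero while `d_diff` does not: the varifold compares shapes through kernel sums over edges, so the representation does not depend on how the curve is sampled. The talk's framework builds on this property, differentiating the varifold norm against a fixed template to obtain a fixed-dimensional, parameterization-invariant encoding.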
Syllabus
Emmanuel Hartman - Parameterization Invariant Representations for Efficient Shape Learning
Taught by
Erwin Schrödinger International Institute for Mathematics and Physics (ESI)