Combinatorial and Geometric Challenges in PAC Learning with Partial Concepts
Institute for Advanced Study via YouTube
Overview
Explore a recent extension of the classical PAC (Probably Approximately Correct) learning framework in this 53-minute computer science seminar. Discover how moving from total function classes to partial function classes enables the modeling of practical, data-dependent assumptions such as margin conditions and low-dimensional structure: a partial function may be left undefined on parts of the domain, reflecting assumptions about the data distribution. Learn why this perspective reveals phenomena fundamentally different from the classical PAC model, including how the traditional empirical risk minimization (ERM) principle, central to classical PAC learning, fails for general partial function classes. Examine the broader question of how much more expressive partial functions are than total functions, a question that leads to new open problems at the intersection of combinatorics, geometry, and topology. Gain insight into these mathematical challenges without prior knowledge of learning theory, as the presentation builds concepts from foundational principles while addressing cutting-edge research questions in computational learning theory.
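To make the idea of a partial concept concrete, here is a minimal illustrative sketch (not from the talk) of the canonical example: a halfspace with a margin condition. The weight vector and margin value below are hypothetical choices for demonstration; the point is that the concept is simply undefined on inputs violating the margin assumption, rather than forced to output a label there.

```python
def partial_halfspace(w, gamma):
    """Build a partial concept: a halfspace defined only where the
    margin condition |<w, x>| >= gamma holds.

    Returns a function h(x) that outputs +1 or -1 outside the margin
    band, and None (undefined) inside it -- encoding the assumption
    that the data never falls within distance gamma of the boundary.
    """
    def h(x):
        s = sum(wi * xi for wi, xi in zip(w, x))  # inner product <w, x>
        if abs(s) < gamma:
            return None  # the partial concept makes no promise here
        return 1 if s > 0 else -1
    return h

# Hypothetical example: weight vector (1, 0) in the plane, margin 0.5.
h = partial_halfspace(w=(1.0, 0.0), gamma=0.5)
print(h((1.0, 0.0)))    # +1: well outside the margin band
print(h((0.1, 0.3)))    # None: inside the margin band, undefined
print(h((-2.0, 1.0)))   # -1: on the negative side
```

A learner for this partial class only needs to succeed on distributions supported where the concept is defined, which is exactly how margin assumptions are usually stated in practice.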
Syllabus
Combinatorial and Geometric Challenges in PAC Learning with Partial Concepts - Shay Moran
Taught by
Institute for Advanced Study