Computational vs Statistical Gaps in Learning and Optimization
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Syllabus
Sitan Chen - Provably learning a multi-head attention layer - IPAM at UCLA
Jelani Nelson - New local differentially private protocols for frequency and mean estimation
Andrea Montanari - Solving overparametrized systems of nonlinear equations - IPAM at UCLA
Ankur Moitra - Learning from Dynamics - IPAM at UCLA
Surbhi Goel - Beyond Worst-case Guarantees for Sequential Prediction: Robustness via Abstention
Matus Telgarsky - A Perceptron Trio - IPAM at UCLA
Raghu Meka - Complexity of Sparse Linear Regression - IPAM at UCLA
Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization
Vatsal Sharan - Memory as a lens to understand efficient learning and optimization - IPAM at UCLA
Cynthia Rush - Is It Easier to Count Communities Than Find Them? - IPAM at UCLA
Giang Tran - Fast Multipole Attention: A Divide-and-Conquer Attention Mechanism for Long Sequences
Arya Mazumdar - Sample complexity of estimation in logistic regression - IPAM at UCLA
Abhineet Agarwal - Understanding and overcoming the statistical limitations of decision trees
Vasilis Kontonis - Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension
Thien Le - On the hardness of learning under symmetries - IPAM at UCLA
Pravesh Kothari - Algorithms Approaching the Threshold for Semirandom Planted Clique - IPAM at UCLA
Adel Javanmard - Learning from Aggregate Responses - IPAM at UCLA
Omer Reingold - Algorithmic Fairness, Loss Minimization and Outcome Indistinguishability
Ravi Kumar - Learning-Augmented Online Optimization - IPAM at UCLA
Wasim Huleihel - Testing Dependency of Databases - IPAM at UCLA
Pasin Manurangsi - Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise
Taught by
Institute for Pure & Applied Mathematics (IPAM)