
Computational vs Statistical Gaps in Learning and Optimization

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore the fundamental gaps between statistical and computational requirements in machine learning through this comprehensive workshop featuring leading researchers from applied mathematics, statistics, optimization, and theoretical computer science. Delve into the critical distinction between statistical limits (the minimum number of samples needed to solve a learning problem at all) and computational limits (the minimum number of samples needed to solve it with an efficient algorithm), and examine why significant gaps exist between these bounds for important problems such as sparse linear regression. Investigate how constraints including privacy, fairness, interpretability, robustness, and parallelization affect the cost and complexity of learning.

Engage with cutting-edge research presentations on multi-head attention learning, differentially private protocols, overparametrized nonlinear systems, robust sequential prediction, the complexity of sparse linear regression, robust neuron learning, memory-efficient optimization, community detection, multipole attention mechanisms, the sample complexity of logistic regression, the statistical limitations of decision trees, learning under symmetries, semirandom planted clique algorithms, learning from aggregate responses, algorithmic fairness, learning-augmented optimization, database dependency testing, and adversarially robust halfspace learning with agnostic noise. Gain insight into how bridging these disciplinary gaps can yield more effective solutions to statistical inference challenges, and into the theoretical foundations that govern the computational-statistical trade-offs in modern machine learning.
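
To make the statistical-versus-computational distinction concrete, here is a textbook-style sketch of the sparse linear regression example mentioned above (an illustration for orientation, not material drawn from the workshop itself). In the model
\[
y = X\beta^* + \varepsilon, \qquad X \in \mathbb{R}^{n \times d}, \qquad \|\beta^*\|_0 \le k,
\]
exhaustive search over all $\binom{d}{k}$ candidate supports recovers $\beta^*$ from roughly
\[
n = O\big(k \log(d/k)\big)
\]
samples, but runs in time exponential in $k$; this is the statistical limit. Efficient estimators such as the Lasso match this sample complexity only under favorable conditions on the design matrix (e.g., restricted eigenvalue conditions), and for ill-conditioned designs no known polynomial-time algorithm attains the statistical limit. This is precisely the kind of computational-statistical gap the workshop examines.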

Syllabus

Sitan Chen - Provably Learning a Multi-Head Attention Layer
Jelani Nelson - New Local Differentially Private Protocols for Frequency and Mean Estimation
Andrea Montanari - Solving Overparametrized Systems of Nonlinear Equations
Ankur Moitra - Learning from Dynamics
Surbhi Goel - Beyond Worst-Case Guarantees for Sequential Prediction: Robustness via Abstention
Matus Telgarsky - A Perceptron Trio
Raghu Meka - Complexity of Sparse Linear Regression
Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization
Vatsal Sharan - Memory as a Lens to Understand Efficient Learning and Optimization
Cynthia Rush - Is It Easier to Count Communities Than Find Them?
Giang Tran - Fast Multipole Attention: A Divide-and-Conquer Attention Mechanism for Long Sequences
Arya Mazumdar - Sample Complexity of Estimation in Logistic Regression
Abhineet Agarwal - Understanding and Overcoming the Statistical Limitations of Decision Trees
Vasilis Kontonis - Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension
Thien Le - On the Hardness of Learning under Symmetries
Pravesh Kothari - Algorithms Approaching the Threshold for Semirandom Planted Clique
Adel Javanmard - Learning from Aggregate Responses
Omer Reingold - Algorithmic Fairness, Loss Minimization and Outcome Indistinguishability
Ravi Kumar - Learning-Augmented Online Optimization
Wasim Huleihel - Testing Dependency of Databases
Pasin Manurangsi - Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise

Taught by

Institute for Pure & Applied Mathematics (IPAM)
