Overview
Explore a computational framework for quantifying the finite-iteration performance of first-order methods in parametric convex optimization through this 59-minute conference talk by Bartolomeo Stellato from Princeton University. Learn how algorithms can be represented as fixed-length computational graphs to address the challenging problem of determining iteration requirements for high-accuracy solutions in real-time applications. Discover how performance verification is formulated as an optimization problem that maximizes performance metrics, such as the norm of the fixed-point residual after a fixed number of iterations.

Understand how this framework encompasses gradient, projection, and proximal algorithms through affine or piecewise-affine constraints, and examine the proof that exact verification is NP-hard. Gain insights into strong semidefinite programming relaxations and exact mixed-integer linear formulations that feature tight polyhedral representations of algorithm steps, and explore bound-tightening techniques that leverage operator theory and algorithm iteration structure to improve scalability.

Review numerical results demonstrating how this method closely matches true worst-case performance while achieving significant reductions in worst-case fixed-point residuals compared to standard convergence analyses.
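To make the verification question concrete, here is a minimal, illustrative sketch (not the talk's exact method): it estimates the worst-case fixed-point residual of projected gradient descent after a fixed number of iterations on a small parametric quadratic program, where the linear term `q` varies over a box. The talk formulates this maximization exactly via semidefinite relaxations and mixed-integer linear programs; random sampling, as below, yields only a lower bound on the true worst case. All problem data (`P`, the box, the iteration count `K`) are hypothetical choices for illustration.

```python
import numpy as np

def projected_gradient_residual(P, q, K, step):
    """Run K projected-gradient steps from x0 = 0 on
    minimize 0.5 x'Px + q'x subject to x >= 0,
    and return the fixed-point residual norm ||x_K - x_{K-1}||."""
    x = np.zeros(len(q))
    prev = x
    for _ in range(K):
        prev = x
        # Gradient step on the quadratic, then projection onto x >= 0.
        x = np.maximum(x - step * (P @ x + q), 0.0)
    return np.linalg.norm(x - prev)

rng = np.random.default_rng(0)
P = np.array([[2.0, 0.5], [0.5, 1.0]])    # fixed positive-definite Hessian (illustrative)
step = 1.0 / np.linalg.eigvalsh(P).max()  # step size 1/L, L = largest eigenvalue
K = 20                                    # fixed iteration budget

# Sample parameters q from the box [-1, 1]^2 and track the largest residual seen.
worst = max(
    projected_gradient_residual(P, rng.uniform(-1.0, 1.0, size=2), K, step)
    for _ in range(1000)
)
print(f"empirical worst-case residual after {K} iterations: {worst:.2e}")
```

The exact approach described in the talk replaces this sampling loop with an optimization over the parameter set, encoding each gradient step and projection as affine or piecewise-affine constraints in the verification problem.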
Syllabus
Exact Performance Verification of First‑Order Methods in Parametric Convex Optimization
Taught by
Simons Institute