Overview
Explore a computational framework for quantifying the finite-iteration performance of first-order methods in parametric convex optimization in this 59-minute conference talk by Bartolomeo Stellato of Princeton University. Learn how algorithms can be represented as fixed-length computational graphs to address a challenging question for real-time applications: how many iterations are needed to reach a high-accuracy solution. Discover how performance verification is formulated as an optimization problem that maximizes a performance metric, such as the norm of the fixed-point residual after a fixed number of iterations, over the family of problem parameters.

Understand how the framework captures gradient, projection, and proximal steps through affine or piecewise-affine constraints, and examine the proof that exact verification is NP-hard. Gain insight into strong semidefinite programming relaxations and exact mixed-integer linear formulations built on tight polyhedral representations of the algorithm steps, along with bound-tightening techniques that exploit operator theory and the iteration structure to improve scalability. Review numerical results showing that the method closely matches true worst-case performance and yields significantly smaller worst-case fixed-point residual bounds than standard convergence analyses.
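To make the verification problem concrete, the sketch below estimates the worst-case fixed-point residual of projected gradient descent on a hypothetical family of box-constrained quadratic programs. This is a minimal illustration, not the talk's method: it samples parameters rather than optimizing over them exactly, so it produces only a lower bound on the true worst case, and all problem data (the matrix P, the box bounds, the parameter set) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (NOT the talk's exact MILP/SDP method): estimate the
# worst-case fixed-point residual ||z^K - z^{K-1}|| of projected gradient
# descent on a parametric box-constrained QP
#     minimize (1/2) z' P z + q' z   subject to   l <= z <= u,
# where the parameter q ranges over a box Theta. Sampling q gives only a
# LOWER bound on the true worst case; the talk's framework certifies it
# exactly. All problem data below are illustrative assumptions.

rng = np.random.default_rng(0)
n, K = 4, 20                          # problem size, number of iterations

P = np.diag([1.0, 2.0, 5.0, 10.0])    # fixed quadratic term (assumed)
lo, hi = -np.ones(n), np.ones(n)      # box constraints (assumed)
step = 1.0 / np.max(np.diag(P))       # step size 1/L for this diagonal P

def pgd_residual(q: np.ndarray) -> float:
    """Run K projected-gradient steps for parameter q and return the
    final fixed-point residual ||z^K - z^{K-1}||_2."""
    z = np.zeros(n)
    for _ in range(K):
        # gradient step followed by projection onto the box [lo, hi]
        z_next = np.clip(z - step * (P @ z + q), lo, hi)
        z, z_prev = z_next, z
    return float(np.linalg.norm(z - z_prev))

# Sample parameters q uniformly from Theta = [-1, 1]^n and take the max.
worst = max(pgd_residual(rng.uniform(-1.0, 1.0, size=n)) for _ in range(10_000))
print(f"sampled worst-case residual after K={K} iterations: {worst:.3e}")
```

The talk's contribution is to replace this sampling loop with an exact optimization over the parameter set: each iteration step, including the piecewise-affine projection, is encoded with tight polyhedral constraints in a mixed-integer linear program (or relaxed to a semidefinite program), so the reported worst case is certified rather than estimated.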
Syllabus
Exact Performance Verification of First-Order Methods in Parametric Convex Optimization
Taught by
Simons Institute