Overview
Explore a mathematical lecture presenting a novel adaptive regularized Newton-CG algorithm that addresses the fundamental trade-off in nonconvex optimization between global complexity and local convergence rates. Learn how new regularizers are constructed and how the regularized Newton equations are solved with a conjugate gradient approach that monitors for negative curvature. Discover how this adaptive method requires no prior knowledge of the Hessian Lipschitz constant while achieving optimal O(ε^{-3/2}) global complexity in second-order oracle calls and Õ(ε^{-7/4}) complexity in Hessian-vector products. Understand the theoretical foundations behind finding ε-stationary points of nonconvex functions with Lipschitz continuous Hessians, and examine how the proposed algorithm exhibits quadratic local convergence when the iterates converge to a point where the Hessian is positive definite. Review preliminary numerical results demonstrating the algorithm's competitiveness, including applications to training physics-informed neural networks, and see how the method bridges the long-standing gap between optimal global complexity and fast local convergence in nonconvex optimization.
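To make the ideas above concrete, here is a minimal, illustrative Python sketch of a regularized Newton step solved by conjugate gradient with negative-curvature monitoring, applied to a toy nonconvex problem. The ||g||^{1/2} regularizer, the absence of a line search, and the toy objective are simplifying assumptions chosen for illustration; this is not the adaptive algorithm presented in the lecture, nor does it reproduce its complexity guarantees.

import numpy as np

def regularized_newton_cg_step(grad, hess_vec, x, sigma, cg_tol=1e-8, max_cg=100):
    """Approximately solve (H(x) + sigma*I) d = -grad(x) by conjugate gradient,
    stopping early if a direction of non-positive curvature is encountered."""
    g = grad(x)
    d = np.zeros_like(x)
    r = -g.copy()                     # residual of (H + sigma*I) d = -g at d = 0
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_cg):
        Hp = hess_vec(x, p) + sigma * p
        curv = p @ Hp
        if curv <= 0:
            # Non-positive curvature in the regularized Hessian: return the
            # curvature direction, signed so that it is a descent direction.
            return (p if p @ g < 0 else -p), True
        alpha = rs_old / curv
        d = d + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= cg_tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d, False

# Toy nonconvex objective (illustrative): f(x) = 0.25*sum(x^4) - 0.5*sum(x^2),
# with a saddle point at the origin and minimizers at the +/-1 vertices.
def grad(x):
    return x**3 - x

def hess_vec(x, v):
    return (3.0 * x**2 - 1.0) * v

x = np.array([0.1, -2.0, 1.5])
for k in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:              # epsilon-stationarity test
        break
    sigma = np.sqrt(np.linalg.norm(g))        # illustrative ||g||^{1/2} regularizer
    d, used_neg_curv = regularized_newton_cg_step(grad, hess_vec, x, sigma)
    x = x + d                                  # a real method would add a line search
print("iterations:", k, "gradient norm:", np.linalg.norm(grad(x)))

As the gradient norm shrinks, the regularizer sigma shrinks with it, so the step approaches a pure Newton step near a minimizer with positive definite Hessian, which is the intuition behind the fast local convergence discussed in the lecture.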
Syllabus
Chao Ding: Adaptive Regularized Newton-CG for Nonconvex Optimization: Optimal Global Complexity...
Taught by
BIMSA