Adaptive Regularized Newton-CG for Nonconvex Optimization - Optimal Global Complexity and Quadratic Local Convergence

BIMSA via YouTube

Overview

Explore a mathematical lecture presenting a novel adaptive regularized Newton-CG algorithm that addresses the fundamental trade-off in nonconvex optimization between global complexity and local convergence rates. Learn how new regularizers are constructed within a conjugate gradient method that monitors for negative curvature while solving the regularized Newton equations. Discover how the adaptive method requires no prior knowledge of the Hessian Lipschitz constant yet achieves the optimal O(ε^{-3/2}) global complexity in second-order oracle calls and Õ(ε^{-7/4}) complexity in Hessian-vector products. Understand the theoretical foundations of finding ε-stationary points of nonconvex functions with Lipschitz continuous Hessians, and examine how the algorithm attains quadratic local convergence when the iterates converge to a point where the Hessian is positive definite. Review preliminary numerical results, including applications to training physics-informed neural networks, that demonstrate the algorithm's competitiveness and show how it bridges the long-standing gap between optimal global complexity and fast local convergence in nonconvex optimization methods.
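The curvature-monitoring idea at the heart of such methods can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the lecture's actual algorithm: it runs plain conjugate gradient on a regularized Newton system (H + σI)d = −g and stops as soon as it observes a direction of non-positive curvature, which an outer adaptive loop could use to enlarge the regularizer σ. The names capped_cg, hvp, sigma, and tol are hypothetical.

```python
import numpy as np

def capped_cg(hvp, g, sigma, tol=1e-8, max_iter=200):
    """Sketch: conjugate gradient on (H + sigma*I) d = -g with
    negative-curvature monitoring. `hvp(v)` returns H @ v.
    Returns (d, kind) with kind in {"newton", "negative_curvature"}."""
    d = np.zeros_like(g)
    if np.linalg.norm(g) <= tol:          # already (approximately) stationary
        return d, "newton"
    r = g.copy()                          # residual (H + sigma*I) d + g at d = 0
    p = -r                                # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Hp = hvp(p) + sigma * p           # regularized Hessian-vector product
        curv = p @ Hp                     # curvature of H + sigma*I along p
        if curv <= 0:
            # Non-positive curvature detected: return the direction so an
            # outer loop can increase sigma or take a negative-curvature step.
            return p / np.linalg.norm(p), "negative_curvature"
        alpha = rs / curv                 # exact line search for the quadratic
        d += alpha * p
        r += alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol:        # residual small: accept Newton step
            return d, "newton"
        p = -r + (rs_new / rs) * p        # standard CG direction update
        rs = rs_new
    return d, "newton"

# Tiny demo on an indefinite quadratic: H = diag(1, -1), g = (1, 1).
# With small sigma the matrix H + sigma*I stays indefinite, so CG is
# expected to report a negative-curvature direction.
H = np.diag([1.0, -1.0])
d, kind = capped_cg(lambda v: H @ v, np.array([1.0, 1.0]), sigma=0.1)
print(kind)  # -> negative_curvature
```

In the adaptive scheme described in the talk, the regularizer is chosen from quantities observable along the run rather than from the unknown Hessian Lipschitz constant, and near a limit point with positive definite Hessian the regularization can vanish fast enough for the iteration to behave like Newton's method, which is consistent with the quadratic local convergence claimed in the abstract.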

Syllabus

Chao Ding: Adaptive Regularized Newton-CG for Nonconvex Optimization: Optimal Global Complexity and Quadratic Local Convergence

Taught by

BIMSA
