
Kurdyka-Łojasiewicz Exponent for Hadamard-Difference-Parameterized Models

Erwin Schrödinger International Institute for Mathematics and Physics (ESI) via YouTube

Overview

Explore a 23-minute conference talk from the Workshop "One World Optimization Seminar in Vienna," held at the Erwin Schrödinger International Institute for Mathematics and Physics (ESI). Delve into L1-regularized optimization problems and their associated smooth "over-parameterized" counterparts built on the Hadamard difference parametrization (HDP). Discover how second-order stationary points of the HDP-based model correspond to stationary points of the L1-regularized model. Learn about the Kurdyka-Łojasiewicz (KL) exponent of the HDP-based model and how it relates to that of the L1-regularized model under specific assumptions. Examine how these results apply to loss functions commonly paired with L1 regularization, such as the least-squares and logistic losses. Gain insight into how KL exponents determine the local convergence rate of standard gradient methods applied to HDP-based models.
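To make the idea concrete, here is a minimal numerical sketch (not from the talk) of the standard Hadamard difference parametrization for a small lasso problem: writing x = u∘u − v∘v turns the nonsmooth objective 0.5·||Ax − b||² + μ||x||₁ into the smooth over-parameterized model F(u, v) = 0.5·||A(u∘u − v∘v) − b||² + μ(||u||² + ||v||²), which plain gradient descent can then minimize. The problem instance (matrix `A`, signal `x_true`, weight `mu`, step size) is illustrative, not taken from the talk.

```python
import numpy as np

# Hypothetical small lasso instance: min_x 0.5*||Ax - b||^2 + mu*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) / np.sqrt(20)  # normalized for a safe step size
x_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])   # sparse ground truth (illustrative)
b = A @ x_true
mu = 0.1

# Hadamard difference parametrization: x = u*u - v*v (elementwise).
# Using ||x||_1 = min { ||u||^2 + ||v||^2 : x = u*u - v*v }, the smooth model is
#   F(u, v) = 0.5*||A(u*u - v*v) - b||^2 + mu*(||u||^2 + ||v||^2).
def grad_F(u, v):
    r = A.T @ (A @ (u * u - v * v) - b)          # gradient of the loss in x
    return 2 * u * (r + mu), 2 * v * (mu - r)    # chain rule through x = u*u - v*v

# Standard gradient descent on the smooth over-parameterized model.
u = np.ones(5)
v = np.ones(5)
step = 0.01
for _ in range(100_000):
    gu, gv = grad_F(u, v)
    u -= step * gu
    v -= step * gv

x_hat = u * u - v * v  # recover the approximate lasso solution
```

After the loop, `x_hat` approximates a stationary point of the original L1-regularized problem; the talk's KL-exponent results are what justify the (local) linear convergence rate one observes for this kind of gradient iteration.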

Syllabus

Ting Kei Pong - Kurdyka-Łojasiewicz exponent for a class of Hadamard-difference-parameterized models

Taught by

Erwin Schrödinger International Institute for Mathematics and Physics (ESI)

