

Foundations of Optimization Algorithms

via CodeSignal

Overview

Optimization is central to machine learning: training a model means minimizing a loss function. This course covers optimization algorithms from basic to advanced, equipping you with the techniques needed to fine-tune machine learning models.

Syllabus

  • Unit 1: Newton's Method for Optimization
    • Optimize the Quadratic Function using Newton's Method
    • Minimize and Plot Optimization Path using Newton's Method
    • Minimize Function and Plot Optimization Paths from Different Initial Guesses
    • Minimize or Maximize?
  • Unit 2: Basic Gradient Descent
    • Finding Minimum of a Complex Function Using Gradient Descent
    • Changing Starting Points in Gradient Descent
    • Experimenting with Learning Rate in Gradient Descent
    • Minimize a 3-Variable Function Using Gradient Descent
    • Implement Gradient Descent with Tolerance Stopping Criterion
  • Unit 3: Gradient Descent with Momentum
    • Applying Momentum in Gradient Descent
    • Gradient Descent with Momentum: Minimize and Plot Contour
    • Adjust Momentum to Observe Convergence Speed
    • Plotting Gradient Descent with Momentum
    • Gradient Descent with Momentum from Multiple Initial Points
  • Unit 4: Adaptive Learning Rate Methods
    • Implementing Adagrad for Function Optimization
    • Optimization Paths using Adagrad from Multiple Initial Points
    • Minimize and Plot Paths with Adagrad and Gradient Descent with Momentum
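Unit 1 centers on Newton's method, which uses second-derivative (curvature) information to choose its step. A minimal one-dimensional sketch of the update rule, not course code; the test function and step count here are illustrative assumptions:

```python
def newton_minimize(grad, hess, x0, steps=10):
    # Newton's update: x <- x - f'(x) / f''(x)
    x = x0
    for _ in range(steps):
        x = x - grad(x) / hess(x)
    return x

# Example: f(x) = (x - 3)**2 + 1, with f'(x) = 2*(x - 3) and f''(x) = 2.
x_star = newton_minimize(lambda x: 2 * (x - 3), lambda x: 2.0, x0=10.0)
```

On a quadratic, a single Newton step lands exactly on the stationary point. This also motivates the unit's "Minimize or Maximize?" question: the same update converges to any stationary point, so checking the sign of the second derivative tells you which one you found.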
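Unit 2 covers plain gradient descent, including a tolerance-based stopping criterion. A rough sketch under assumed values for the learning rate and tolerance (the example function is mine, not the course's):

```python
def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iters=10_000):
    # Repeat x <- x - lr * grad(x) until the step size falls below tol.
    x = x0
    for _ in range(max_iters):
        step = lr * grad(x)
        x = x - step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = (x + 2)**2, gradient 2*(x + 2), minimum at x = -2.
x_min = gradient_descent(lambda x: 2 * (x + 2), x0=5.0)
```

Varying `x0` and `lr`, as the unit's exercises on starting points and learning rates suggest, shows how too large a learning rate can diverge while too small a one converges slowly.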
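Unit 3 adds momentum to gradient descent: a velocity term accumulates an exponentially decaying sum of past gradients, smoothing the path and often speeding convergence. A one-dimensional sketch with assumed hyperparameters:

```python
def gd_momentum(grad, x0, lr=0.01, beta=0.9, iters=500):
    # Velocity update: v <- beta * v - lr * grad(x), then x <- x + v.
    # beta (the momentum coefficient) controls how much past gradients persist.
    x, v = x0, 0.0
    for _ in range(iters):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Example: f(x) = x**2, gradient 2*x, minimum at x = 0.
x_min = gd_momentum(lambda x: 2 * x, x0=8.0)
```

Adjusting `beta`, as in the unit's convergence-speed exercise, trades off smoothing against overshoot: higher momentum can oscillate around the minimum before settling.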
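Unit 4 turns to adaptive learning rates, with Adagrad as the example: each update is scaled down by the accumulated squared gradients, so the effective step shrinks over time. A minimal one-dimensional sketch with assumed hyperparameters:

```python
import math

def adagrad(grad, x0, lr=0.5, eps=1e-8, iters=2000):
    # Accumulate squared gradients in G; the effective step is
    # lr / sqrt(G + eps), which decays as training progresses.
    x, G = x0, 0.0
    for _ in range(iters):
        g = grad(x)
        G += g * g
        x = x - lr * g / math.sqrt(G + eps)
    return x

# Example: f(x) = (x - 1)**2, gradient 2*(x - 1), minimum at x = 1.
x_min = adagrad(lambda x: 2 * (x - 1), x0=6.0)
```

The shrinking step is what makes Adagrad's optimization paths (plotted from multiple initial points in the unit's exercises) look different from momentum's: early steps are large, later ones cautious.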

