
YouTube

Stochastic Gradient Descent for SVM - Lecture 21

UofU Data Science via YouTube

Overview

Explore a comprehensive lecture on stochastic sub-gradient descent as an optimization strategy for the Support Vector Machine (SVM) loss function. This 1-hour-20-minute session from UofU Data Science examines the effectiveness of the approach and investigates its relationship to the perceptron algorithm. Gain insight into machine learning optimization techniques through practical explanations and theoretical connections. Additional resources are available on the accompanying lecture materials webpage.
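To make the topic concrete, here is a minimal sketch of stochastic sub-gradient descent on the regularized hinge loss of a linear SVM. This is not the lecture's own code; the function name, data, and hyperparameters are illustrative choices. The update uses the sub-gradient of the hinge loss (the loss is non-differentiable at margin 1), which is why "sub-gradient" descent is the right term here.

```python
import numpy as np

def svm_sgd(X, y, lam=0.01, epochs=50, lr0=0.1, seed=0):
    """Stochastic sub-gradient descent for a linear SVM (illustrative sketch).

    Minimizes (lam/2)*||w||^2 + mean(max(0, 1 - y_i * (w @ x_i)))
    over w, with labels y_i in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):  # visit examples in random order
            t += 1
            lr = lr0 / (1 + lr0 * lam * t)  # decaying step size
            margin = y[i] * (X[i] @ w)
            # Sub-gradient of the hinge term at this example:
            # -y_i * x_i when the margin constraint is violated, else 0.
            if margin < 1:
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
            else:
                w = (1 - lr * lam) * w
    return w

# Linearly separable toy data
X = np.array([[2.0, 2.0], [1.5, 1.8], [-2.0, -1.5], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = svm_sgd(X, y)
print(np.sign(X @ w))  # predictions should match the labels
```

Note the connection to the perceptron discussed in the lecture: the perceptron updates only on misclassified points (`margin < 0`) with no regularization, whereas the SVM sub-gradient step also updates on correctly classified points inside the margin (`margin < 1`) and shrinks `w` toward zero each step.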

Syllabus

Lecture 21: Stochastic Gradient Descent for SVM

Taught by

UofU Data Science
