
YouTube

What Textbooks Don't Tell You About Curve Fitting

Artem Kirsanov via YouTube

Overview

Explore a deep dive into the probabilistic interpretation of linear regression in this 18-minute video by Artem Kirsanov, a graduate student at the NYU Center for Neural Science and a researcher at the Flatiron Institute. Learn how the least squares objective emerges naturally when maximizing the probability of the observed data under a model, and why the squared term follows from assuming a Gaussian distribution for the noise in the samples. Discover how incorporating prior beliefs about the distribution of parameters leads to different regularization terms in the objective function. The video covers what regression is, fitting noise in a linear model, deriving least squares, incorporating priors, L2 regularization as a Gaussian prior, and L1 regularization as a Laplace prior, and concludes by synthesizing these ideas into an understanding of curve fitting that goes beyond standard textbook explanations.
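The core derivation, that least squares is maximum likelihood under Gaussian noise, can be checked numerically. The sketch below uses made-up synthetic data (not from the video): since the Gaussian negative log-likelihood equals the sum of squared errors scaled by 1/(2σ²) plus a constant, both objectives are minimized by the same parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y = 2x + 1 plus Gaussian noise
x = np.linspace(0, 1, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.shape)

def neg_log_likelihood(w, b, sigma=0.1):
    # Gaussian noise model: y_i ~ N(w*x_i + b, sigma^2).
    # NLL = SSE / (2 sigma^2) + n * log(sigma * sqrt(2 pi))
    resid = y - (w * x + b)
    return 0.5 * np.sum(resid**2) / sigma**2 + len(x) * np.log(sigma * np.sqrt(2 * np.pi))

def sum_squared_error(w, b):
    return np.sum((y - (w * x + b))**2)

# The NLL is a strictly increasing transform of the SSE, so both
# objectives pick out the same (w, b) on any candidate grid.
grid = [(w, b) for w in np.linspace(1, 3, 21) for b in np.linspace(0, 2, 21)]
best_nll = min(grid, key=lambda p: neg_log_likelihood(*p))
best_sse = min(grid, key=lambda p: sum_squared_error(*p))
```

Here `best_nll` and `best_sse` coincide, which is the point of the derivation: the Gaussian assumption is what makes squared error the "right" loss.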

Syllabus

00:00 Introduction
01:16 What is Regression
02:11 Fitting noise in a linear model
06:02 Deriving Least Squares
07:46 Sponsor: Squarespace
09:04 Incorporating Priors
12:06 L2 regularization as Gaussian Prior
14:30 L1 regularization as Laplace Prior
16:16 Putting all together
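The regularization chapters can be sketched the same way. Under a zero-mean Gaussian prior on the weights with variance τ², the MAP estimate minimizes squared error plus an L2 penalty of strength λ = σ²/τ², which is ridge regression with a closed-form solution. A minimal check on synthetic data (all numbers here are hypothetical, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + rng.normal(0, 0.2, size=30)

sigma2 = 0.04        # assumed noise variance
tau2 = 1.0           # assumed prior variance on the weights
lam = sigma2 / tau2  # equivalent L2 penalty strength

# MAP under a Gaussian prior: minimize ||y - Xw||^2 + lam * ||w||^2,
# whose minimizer is the ridge solution (X^T X + lam I) w = X^T y.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Sanity check: the gradient of the penalized objective vanishes at w_ridge.
grad = -2 * X.T @ (y - X @ w_ridge) + 2 * lam * w_ridge
```

Swapping the Gaussian prior for a Laplace prior turns the ||w||² term into an L1 penalty (lasso), which no longer has a closed form but is minimized the same MAP way.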

Taught by

Artem Kirsanov

