YouTube

Backpropagation with Automatic Differentiation from Scratch in Python

Yacine Mahdid via YouTube

Overview

Learn to implement automatic differentiation from scratch in Python by building your own autograd system similar to PyTorch's core functionality. Explore the theoretical foundations of automatic differentiation and understand how computational graphs enable efficient gradient computation through backpropagation. Master the step-by-step process of creating autograd operations for fundamental mathematical functions including addition, multiplication, and ReLU activation, then implement the backward pass algorithm that automatically computes gradients. Gain deep insights into the mechanics behind modern deep learning frameworks by constructing a working autograd library that demonstrates how neural networks efficiently calculate derivatives for optimization.
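The system described above can be sketched as a small scalar `Value` class that records the computational graph as operations are applied, then replays it in reverse to compute gradients. This is a minimal illustrative sketch in the spirit of the video (the class and method names are hypothetical, not the video's actual code):

```python
class Value:
    """A scalar that tracks its computation graph for reverse-mode autodiff."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # fills in gradients of the inputs
        self._prev = set(_children)    # parent nodes in the graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1 and d(a+b)/db = 1, so the gradient flows through unchanged
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b and d(a*b)/db = a (the "swap" rule for products)
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def relu(self):
        out = Value(max(0.0, self.data), (self,))
        def _backward():
            # ReLU passes the gradient only where the input was positive
            self.grad += (out.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0  # d(out)/d(out) = 1 seeds the backward pass
        for node in reversed(topo):
            node._backward()
```

For example, with `a = Value(2.0)` and `b = Value(-3.0)`, calling `(a * b + a).backward()` sets `a.grad` to `b + 1 = -2.0` and `b.grad` to `a = 2.0`, matching the derivatives computed by hand.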

Syllabus

- Introduction: 0:00
- Automatic differentiation theory: 0:53
- Computational graph definition: 3:30
- Backpropagation with automatic differentiation: 7:04
- Autograd from scratch in Python: 10:44
- Autograd - addition: 13:52
- Autograd - multiplication: 18:34
- Autograd - ReLU: 20:44
- Autograd - backward: 22:40
- Recap: 26:13
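A standard way to sanity-check the backward pass built in the steps above is a finite-difference gradient check: compare the analytic gradients produced by the chain rule against numerical estimates. A small self-contained sketch (the function `f` here is an illustrative example, not taken from the video):

```python
# Gradient-check sketch: compare chain-rule derivatives against finite differences.
def f(x, y):
    # relu(x*y + x), built from the same ops the autograd covers: mul, add, ReLU
    return max(0.0, x * y + x)

# By the chain rule, for x*y + x > 0: df/dx = y + 1 and df/dy = x.
x, y, eps = 2.0, 3.0, 1e-6
num_dx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)  # central difference in x
num_dy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)  # central difference in y
print(num_dx, num_dy)  # both should closely match 4.0 and 2.0
```

If the backward pass is correct, its gradients agree with these numerical estimates to within a small tolerance; a mismatch usually points to a bug in one operation's local derivative.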

Taught by

Yacine Mahdid

