Overview
Learn to implement automatic differentiation from scratch in Python by building your own autograd system, modeled on the core functionality of PyTorch. Explore the theoretical foundations of automatic differentiation and see how computational graphs enable efficient gradient computation through backpropagation. Work step by step through creating autograd operations for fundamental mathematical functions, including addition, multiplication, and the ReLU activation, then implement the backward pass that automatically computes gradients. By constructing a working autograd library, gain insight into how modern deep learning frameworks let neural networks efficiently calculate the derivatives needed for optimization.
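The system described above can be sketched as a small scalar autograd, in the spirit of PyTorch's autograd. This is an illustrative outline, not the exact code built in the video; the `Value` class name and its API are assumptions for the sketch. Each operation records its inputs in a computational graph and a closure for its local derivative, and `backward()` replays those closures in reverse topological order:

```python
class Value:
    """A scalar node in a computational graph that records how it was produced."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # closure that pushes this node's grad to its inputs
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a + b)/da = 1 and d(a + b)/db = 1, so the gradient passes through
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # d(a * b)/da = b and d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def relu(self):
        out = Value(self.data if self.data > 0 else 0.0, (self,))

        def _backward():
            # ReLU passes the gradient only where the input was positive
            self.grad += (out.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph so each node's grad is complete
        # before it is propagated to its inputs.
        topo, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)

        self.grad = 1.0  # d(output)/d(output)
        for v in reversed(topo):
            v._backward()


# Example: c = relu(a*b + a); dc/da = b + 1 = 4, dc/db = a = 2
a, b = Value(2.0), Value(3.0)
c = (a * b + a).relu()
c.backward()
```

Note that gradients are accumulated with `+=` rather than assigned, so a node used more than once (like `a` above) correctly sums the contributions from each path through the graph.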
Syllabus
- Introduction: 0:00
- Automatic differentiation theory: 0:53
- Computational graph definition: 3:30
- Backpropagation with automatic differentiation: 7:04
- Autograd from scratch in Python: 10:44
- Autograd - addition: 13:52
- Autograd - multiplication: 18:34
- Autograd - ReLU: 20:44
- Autograd - backward: 22:40
- Recap: 26:13
Taught by
Yacine Mahdid