YouTube videos curated by Class Central.
Classroom Contents
Convolutional Network Back Propagation by Hand - The Math You Should Know
- 1 00:00 What & Why back propagation?
- 2 2:01 Conv Network we will work with through the video
- 3 6:10 High level explanation
- 4 8:36 Forward pass with a labeled sample
- 5 15:44 Useful math for the backward pass: the multivariate chain rule
- 6 16:53 Backward pass: output layer gradients
- 7 23:44 Backward pass: 3rd weight layer gradients
- 8 26:32 Backward pass: hidden layer gradients
- 9 30:24 Backward pass: 2nd weight layer gradients
- 10 31:53 Backward pass: layer p gradients
- 11 32:27 Backward pass: pooling gradients
- 12 37:05 Backward pass: ReLU gradients
- 13 38:37 **IMPORTANT** Backward pass: convolution gradients (feature map gradients)
- 14 47:15 Updating weights
- 15 50:20 PyTorch code to verify the hand calculations are correct
- 16 51:06 Quiz Time
- 17 52:18 Summary & Conclusion
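The closing chapters (the multivariate chain rule, ReLU gradients, and verifying the hand calculations in code) can be illustrated with a minimal sketch. This is not the video's conv network: it is a hypothetical one-weight model (linear layer, ReLU, squared error) used to show the same workflow, with the framework check replaced by a plain finite-difference estimate so no libraries are needed.

```python
def forward(w, x):
    z = w * x                      # linear layer
    return z if z > 0 else 0.0     # ReLU activation

def loss(w, x, y):
    a = forward(w, x)
    return 0.5 * (a - y) ** 2      # squared-error loss

def hand_gradient(w, x, y):
    # Multivariate chain rule by hand: dL/dw = dL/da * da/dz * dz/dw
    z = w * x
    a = z if z > 0 else 0.0
    dL_da = a - y                   # derivative of 0.5*(a - y)^2 w.r.t. a
    da_dz = 1.0 if z > 0 else 0.0   # ReLU derivative
    dz_dw = x                       # derivative of w*x w.r.t. w
    return dL_da * da_dz * dz_dw

def numeric_gradient(w, x, y, eps=1e-6):
    # Central finite difference, used here in place of PyTorch autograd
    return (loss(w + eps, x, y) - loss(w - eps, x, y)) / (2 * eps)

w, x, y = 0.7, 2.0, 1.0
g_hand = hand_gradient(w, x, y)       # 0.8 for these values
g_num = numeric_gradient(w, x, y)
print(abs(g_hand - g_num) < 1e-5)     # hand calculation matches the numeric check
w_updated = w - 0.1 * g_hand          # one SGD weight update (learning rate 0.1)
```

The final line mirrors the "Updating weights" chapter: once the hand-derived gradient is verified, a gradient-descent step just subtracts the learning rate times the gradient.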