Convolutional Network Back Propagation by Hand - The Math You Should Know

CodeEmporium via YouTube

Class Central Classrooms

YouTube videos curated by Class Central.

Classroom Contents

  1. 00:00 What & why back propagation?
  2. 2:01 The conv network we will work with throughout the video
  3. 6:10 High-level explanation
  4. 8:36 Forward pass with a labeled sample
  5. 15:44 Useful math for the backward pass: the multivariate chain rule
  6. 16:53 Backward pass: output layer gradients
  7. 23:44 Backward pass: 3rd weight layer gradients
  8. 26:32 Backward pass: hidden layer gradients
  9. 30:24 Backward pass: 2nd weight layer gradients
  10. 31:53 Backward pass: layer p gradients
  11. 32:27 Backward pass: pooling gradients
  12. 37:05 Backward pass: ReLU gradients
  13. 38:37 **IMPORTANT** Backward pass: convolution gradients (feature map gradients)
  14. 47:15 Updating weights
  15. 50:20 PyTorch code to verify the hand calculations
  16. 51:06 Quiz time
  17. 52:18 Summary & conclusion
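The backward pass in the outline starts from the output layer. A common setup for a classification network like this one is softmax activation with cross-entropy loss, in which case the gradient of the loss with respect to the pre-activation logits collapses to `p - y`. The video may use a different loss, so this is a minimal sketch under that assumption:

```python
import numpy as np

def softmax(z):
    # Shift by max(z) for numerical stability; result is unchanged
    e = np.exp(z - z.max())
    return e / e.sum()

def output_layer_grad(z, y):
    # For softmax + cross-entropy, dL/dz simplifies to (p - y),
    # where p = softmax(z) and y is the one-hot label
    return softmax(z) - y

z = np.array([2.0, 1.0, 0.1])      # example logits
y = np.array([1.0, 0.0, 0.0])      # one-hot label for class 0
g = output_layer_grad(z, y)
```

Because the softmax probabilities sum to 1 and `y` is one-hot, the components of this gradient always sum to zero, which is a quick sanity check when doing the arithmetic by hand.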
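The pooling and ReLU gradient steps in the outline have simple closed forms: ReLU passes upstream gradient only where its input was positive, and max-pooling routes each upstream gradient to the argmax position of its window. A small sketch, assuming 2x2 non-overlapping max-pooling on a single channel (the exact pooling in the video may differ):

```python
import numpy as np

def relu_backward(upstream, x):
    # Gradient flows only through positions where the forward input was > 0
    return upstream * (x > 0)

def maxpool2x2_backward(upstream, x):
    # Route each upstream gradient to the max position of its 2x2 window;
    # all other positions in the window receive zero gradient
    grad = np.zeros_like(x)
    H, W = x.shape
    for i in range(0, H, 2):
        for j in range(0, W, 2):
            window = x[i:i + 2, j:j + 2]
            r, c = np.unravel_index(np.argmax(window), window.shape)
            grad[i + r, j + c] = upstream[i // 2, j // 2]
    return grad

x = np.array([[1.0, -2.0],
              [3.0,  0.5]])        # one 2x2 pooling window
up = np.array([[1.0]])             # upstream gradient for the pooled output
g_pool = maxpool2x2_backward(up, x)
g_relu = relu_backward(np.ones(2), np.array([-1.0, 2.0]))
```

In the example, the whole upstream gradient lands on the position of the 3.0, which is exactly the routing the hand calculation has to reproduce.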
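For the convolution-gradient step the outline flags as important, the key identity is that the gradient of the loss with respect to a kernel is itself a cross-correlation: of the layer's input with the upstream feature-map gradient. The sketch below assumes a single-channel "valid" cross-correlation (the convention used by most deep-learning libraries) and, in the spirit of the outline's verification step, checks the hand formula against central finite differences rather than a specific framework:

```python
import numpy as np

def conv2d_valid(x, k):
    # 'valid' cross-correlation: slide k over x with no padding
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def conv_kernel_grad(x, upstream):
    # dL/dK is the cross-correlation of the input with the upstream
    # feature-map gradient; the result has the kernel's shape
    return conv2d_valid(x, upstream)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5))    # layer input
k = rng.standard_normal((3, 3))    # kernel
up = rng.standard_normal((3, 3))   # stand-in for dL/d(feature map)

analytic = conv_kernel_grad(x, up)

# Numerical check: perturb each kernel entry and difference the
# surrogate loss L = sum(upstream * output)
numeric = np.zeros_like(k)
eps = 1e-6
for i in range(3):
    for j in range(3):
        kp, km = k.copy(), k.copy()
        kp[i, j] += eps
        km[i, j] -= eps
        numeric[i, j] = (np.sum(up * conv2d_valid(x, kp)) -
                         np.sum(up * conv2d_valid(x, km))) / (2 * eps)
```

A finite-difference check like this is a framework-free way to confirm hand-derived gradients; PyTorch's autograd (as used at the end of the video) serves the same purpose.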
