YouTube videos curated by Class Central.
Classroom Contents
Why Neural Networks Are So Deep - AlexNet Explained
- 1 00:00 Introduction
- 2 00:15 Timeline of neural network research
- 3 01:52 Rise of GPUs and how it helped neural networks
- 4 02:18 ImageNet and how it helped computer vision research
- 5 02:56 How AlexNet came to be
- 6 04:09 AlexNet architecture and training at a high level
- 7 05:53 ReLU activation and how it removes vanishing gradients
- 8 08:30 Training on multiple GPUs and how it speeds up performance
- 9 10:09 Local Response Normalization to mimic lateral inhibition
- 10 14:27 Overlapping pooling
- 11 15:15 Why do deep networks overfit?
- 12 15:57 Data Augmentation and how it curbs overfitting
- 13 17:02 Dropout and how it curbs overfitting
- 14 19:32 Putting it all together
- 15 20:24 Quiz Time
- 16 21:22 Summary
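
Two of the techniques listed above, ReLU (chapter 7) and dropout (chapter 13), are simple enough to sketch directly. The following is a minimal pure-Python illustration of both ideas, not code from the video; the function names and the inverted-dropout scaling convention are assumptions for the sake of the example:

```python
import random

def relu(x):
    # ReLU keeps positive activations and zeroes out negatives.
    # Its gradient is 1 for positive inputs, which is why it avoids
    # the vanishing gradients of saturating activations like sigmoid.
    return [max(0.0, v) for v in x]

def dropout(x, p=0.5, training=True):
    # During training, zero each activation with probability p and
    # scale survivors by 1/(1-p) ("inverted" dropout), so the expected
    # activation is unchanged and no rescaling is needed at test time.
    if not training:
        return list(x)
    return [0.0 if random.random() < p else v / (1 - p) for v in x]
```

For example, `relu([-2.0, 0.5, 3.0])` returns `[0.0, 0.5, 3.0]`, while `dropout` leaves its input untouched when `training=False`.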