Overview

This course walks learners through improving a weak neural network using techniques specific to deep learning, including dropout, early stopping, and batch normalization.

Syllabus
- Unit 1: Training and Evaluating a Simple Neural Network in PyTorch (code sketch below)
  - Learning Rate Impact on Training
  - Building a Neural Network from Scratch
  - Implementing Model Evaluation During Training
  - Detecting Overfitting in Neural Networks
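A minimal sketch of the Unit 1 workflow, assuming a small synthetic dataset and an illustrative two-layer MLP (the data, network shape, and hyperparameters are placeholders, not the course's exact code): train the model, evaluate it on held-out data each epoch, and watch the train/validation gap to detect overfitting.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Synthetic stand-in data: 200 training and 50 validation samples, 10 features.
X_train, y_train = torch.randn(200, 10), torch.randint(0, 2, (200,))
X_val, y_val = torch.randn(50, 10), torch.randint(0, 2, (50,))

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # illustrative learning rate
criterion = nn.CrossEntropyLoss()

for epoch in range(20):
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(X_train), y_train)
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = criterion(model(X_val), y_val)
    # A validation loss that rises while training loss keeps falling
    # is the classic overfitting signal this unit teaches learners to spot.
    print(f"epoch {epoch:2d}  train {loss.item():.4f}  val {val_loss.item():.4f}")
```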
- Unit 2: Adding Dropout to Neural Networks in PyTorch (code sketch below)
  - Adding Dropout to Prevent Overfitting
  - Parameterizing Dropout for Flexible Models
  - Fixing Dropout Mode for Stable Validation
  - Applying Varied Dropout in Deep Networks
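A minimal sketch of the Unit 2 ideas, assuming an illustrative MLP: dropout probabilities are constructor parameters so different rates can be applied per layer, and switching between `train()` and `eval()` mode controls whether dropout is active. The class name, sizes, and rates are assumptions, not the course's exact code.

```python
import torch
import torch.nn as nn

class DropoutMLP(nn.Module):
    # dropout_rates is a hypothetical parameter: one rate per hidden layer.
    def __init__(self, dropout_rates=(0.5, 0.3)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(10, 64), nn.ReLU(), nn.Dropout(dropout_rates[0]),
            nn.Linear(64, 32), nn.ReLU(), nn.Dropout(dropout_rates[1]),
            nn.Linear(32, 2),
        )

    def forward(self, x):
        return self.net(x)

model = DropoutMLP(dropout_rates=(0.5, 0.3))
x = torch.randn(8, 10)

model.train()                                 # dropout active: forward passes differ
print(torch.equal(model(x), model(x)))        # False (with high probability)

model.eval()                                  # dropout disabled: validation is stable
with torch.no_grad():
    print(torch.equal(model(x), model(x)))    # True
```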
- Unit 3: Early Stopping in PyTorch: Preventing Overfitting During Training (code sketch below)
  - Visualizing the Early Stopping Counter
  - Debugging Early Stopping Comparison Logic
  - Adding Minimum Delta for Early Stopping
  - Saving the Best Model with Checkpoints
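A minimal sketch of the Unit 3 mechanics, assuming a hypothetical `EarlyStopping` helper: a patience counter, a `min_delta` improvement threshold, and a checkpoint of the best weights. The class, its defaults, and the `val_losses` sequence are illustrative assumptions.

```python
import torch

class EarlyStopping:
    def __init__(self, patience=3, min_delta=0.01, path="best_model.pt"):
        self.patience, self.min_delta, self.path = patience, min_delta, path
        self.best_loss = float("inf")
        self.counter = 0

    def step(self, val_loss, model):
        # Only an improvement larger than min_delta counts and resets the counter.
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.counter = 0
            torch.save(model.state_dict(), self.path)  # checkpoint the best model
        else:
            self.counter += 1
        return self.counter >= self.patience  # True -> stop training

model = torch.nn.Linear(10, 2)  # stand-in model
stopper = EarlyStopping()
# Fabricated validation losses: improvement stalls after epoch 1.
for epoch, val_loss in enumerate([0.9, 0.7, 0.69, 0.71, 0.70, 0.72]):
    if stopper.step(val_loss, model):
        print(f"early stop at epoch {epoch}, best loss {stopper.best_loss:.2f}")
        break
```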
- Unit 4: Adding Batch Normalization to Neural Networks in PyTorch (code sketch below)
  - Adding Batch Normalization to MLP
  - Fixing Batch Normalization Dimension Mismatch
  - Batch Normalization in Deeper Networks
  - Comparing Models With and Without BatchNorm
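A minimal sketch of the Unit 4 comparison, assuming illustrative layer widths: the same MLP built with and without `nn.BatchNorm1d`. Each `BatchNorm1d` must be sized to the output width of the preceding `nn.Linear` layer; getting this wrong produces the dimension mismatch the unit debugs.

```python
import torch
import torch.nn as nn

def make_mlp(use_batchnorm: bool) -> nn.Sequential:
    layers = []
    widths = [10, 64, 32]  # illustrative input and hidden widths
    for in_f, out_f in zip(widths, widths[1:]):
        layers.append(nn.Linear(in_f, out_f))
        if use_batchnorm:
            # num_features must equal out_f (the Linear layer's output width),
            # not in_f -- mixing these up causes the dimension mismatch error.
            layers.append(nn.BatchNorm1d(out_f))
        layers.append(nn.ReLU())
    layers.append(nn.Linear(widths[-1], 2))
    return nn.Sequential(*layers)

plain, normed = make_mlp(False), make_mlp(True)
x = torch.randn(16, 10)  # BatchNorm1d needs batch size > 1 in training mode
print(plain(x).shape, normed(x).shape)  # both: torch.Size([16, 2])
```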