Overview

This course builds on earlier neural network improvements by introducing learners to advanced optimization techniques: learning rate scheduling, optimizer selection, and weight initialization. A final challenge ties these together with dropout, batch normalization, and early stopping.

Syllabus
- Unit 1: Learning Rate Scheduling with PyTorch StepLR
  - Creating Your First StepLR Scheduler
  - Activating the Learning Rate Scheduler
  - Building a Complete Training Loop
  - Fixing the Learning Rate Scheduler Order
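A minimal sketch of the Unit 1 workflow, assuming a toy linear model and synthetic data (both illustrative, not from the course): StepLR multiplies the learning rate by gamma every step_size epochs, and the scheduler is stepped after the optimizer, once per epoch, which is the ordering the last lesson's title refers to.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)                      # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Multiply the learning rate by gamma every step_size epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)

inputs = torch.randn(32, 10)                  # synthetic batch (assumption)
targets = torch.randn(32, 1)
loss_fn = nn.MSELoss()

for epoch in range(15):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()        # update weights first...
    scheduler.step()        # ...then advance the scheduler (once per epoch)
    print(f"epoch {epoch}: lr = {scheduler.get_last_lr()[0]:.4f}")
```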
- Unit 2: Comparing SGD and Adam Optimizers in PyTorch
  - Setting Up Multiple PyTorch Optimizers
  - Resetting Models for Fair Optimizer Comparison
  - Implementing Training Loops for Optimizer Comparison
  - Optimizers Need Different Learning Rates
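A minimal sketch of the Unit 2 comparison, assuming a small illustrative model and synthetic data: each optimizer trains a deep copy of the same starting weights (so the comparison is fair), and SGD and Adam are given different, optimizer-appropriate learning rates.

```python
import copy
import torch
import torch.nn as nn

def make_model():
    return nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))

torch.manual_seed(0)
template = make_model()                      # shared starting weights
inputs, targets = torch.randn(64, 10), torch.randn(64, 1)  # synthetic data (assumption)
loss_fn = nn.MSELoss()

# SGD typically needs a larger learning rate than Adam; these values are assumptions.
configs = {
    "SGD":  lambda params: torch.optim.SGD(params, lr=0.1),
    "Adam": lambda params: torch.optim.Adam(params, lr=0.001),
}

for name, make_opt in configs.items():
    model = copy.deepcopy(template)          # reset to identical initial weights
    optimizer = make_opt(model.parameters())
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
    print(f"{name}: final loss = {loss.item():.4f}")
```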
- Unit 3: Weight Initialization with Xavier Uniform in PyTorch
  - Exploring Default Weight Initialization in PyTorch
  - Applying Xavier Initialization to Neural Layers
  - Initializing All Layers At Once
  - Selective Xavier Initialization for Complex Networks
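A minimal sketch of Unit 3's approach, with an illustrative architecture: model.apply() visits every submodule recursively, so Xavier uniform initialization can be applied to all layers at once, and the isinstance check makes it selective, touching only Linear layers.

```python
import torch
import torch.nn as nn

def init_xavier(module):
    # Only re-initialize Linear layers; leave other module types at their defaults.
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)
        nn.init.zeros_(module.bias)

model = nn.Sequential(                 # illustrative architecture (assumption)
    nn.Linear(10, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
model.apply(init_xavier)               # recursively applies init_xavier to every submodule
print(model[0].weight.std().item())    # spread now reflects Xavier's fan-in/fan-out scaling
```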
- Unit 4: Final Challenge: Combining Best Practices to Strengthen Your Neural Network
  - Adding Dropout to Fight Overfitting
  - Combining Dropout and Batch Normalization
  - Implementing Early Stopping for Better Results
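A minimal sketch combining Unit 4's practices, assuming an illustrative model, synthetic train/validation splits, and a patience of 5 epochs (all assumptions): dropout and batch normalization regularize training, while patience-based early stopping halts once validation loss stops improving.

```python
import torch
import torch.nn as nn

model = nn.Sequential(                  # illustrative architecture (assumption)
    nn.Linear(10, 32),
    nn.BatchNorm1d(32),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
loss_fn = nn.MSELoss()

x_train, y_train = torch.randn(64, 10), torch.randn(64, 1)  # synthetic data (assumption)
x_val, y_val = torch.randn(16, 10), torch.randn(16, 1)

best_val, patience, wait = float("inf"), 5, 0
for epoch in range(200):
    model.train()                       # enable dropout and batch-norm updates
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()

    model.eval()                        # disable dropout, use running batch-norm stats
    with torch.no_grad():
        val_loss = loss_fn(model(x_val), y_val).item()

    if val_loss < best_val:
        best_val, wait = val_loss, 0    # improvement: reset the patience counter
    else:
        wait += 1
        if wait >= patience:            # no improvement for `patience` epochs
            print(f"early stop at epoch {epoch}, best val loss {best_val:.4f}")
            break
```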