

Advanced Neural Tuning

via CodeSignal

Overview

This course builds on earlier neural network training lessons, introducing advanced optimization techniques such as learning rate scheduling, optimizer selection, and weight initialization.

Syllabus

  • Unit 1: Learning Rate Scheduling with PyTorch StepLR
    • Creating Your First StepLR Scheduler
    • Activating the Learning Rate Scheduler
    • Building a Complete Training Loop
    • Fixing the Learning Rate Scheduler Order
  • Unit 2: Comparing SGD and Adam Optimizers in PyTorch
    • Setting Up Multiple PyTorch Optimizers
    • Resetting Models for Fair Optimizer Comparison
    • Implementing Training Loops for Optimizer Comparison
    • Optimizers Need Different Learning Rates
  • Unit 3: Weight Initialization with Xavier Uniform in PyTorch
    • Exploring Default Weight Initialization in PyTorch
    • Applying Xavier Initialization to Neural Layers
    • Initializing All Layers At Once
    • Selective Xavier Initialization for Complex Networks
  • Unit 4: Final Challenge: Combining Best Practices to Strengthen Your Neural Network
    • Adding Dropout to Fight Overfitting
    • Combining Dropout and Batch Normalization
    • Implementing Early Stopping for Better Results
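A minimal sketch of the technique Unit 1 covers: a StepLR scheduler that decays the learning rate on a fixed schedule. The model, data, and hyperparameters here are illustrative, not taken from the course; the key point is the call order, with `optimizer.step()` before `scheduler.step()`.

```python
import torch
import torch.nn as nn

# Illustrative toy model and data (not from the course).
torch.manual_seed(0)
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate every 2 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)
criterion = nn.MSELoss()

x, y = torch.randn(8, 4), torch.randn(8, 1)
for epoch in range(6):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()    # update weights first...
    scheduler.step()    # ...then advance the schedule (order matters)

# After 6 epochs with step_size=2, gamma=0.5: lr = 0.1 * 0.5**3 = 0.0125
```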
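Unit 2's fair-comparison idea can be sketched as follows: each optimizer trains its own deep copy of the same starting model, and each gets a learning rate in its typical range (SGD usually needs a larger one than Adam). The network, data, and learning rates are assumptions for illustration.

```python
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
base = nn.Linear(4, 1)              # shared starting weights
x, y = torch.randn(32, 4), torch.randn(32, 1)
criterion = nn.MSELoss()

results = {}
for name, make_opt in [
    ("SGD", lambda p: torch.optim.SGD(p, lr=0.1)),      # SGD: larger lr
    ("Adam", lambda p: torch.optim.Adam(p, lr=0.001)),  # Adam: typical default
]:
    model = copy.deepcopy(base)     # reset to identical weights each time
    opt = make_opt(model.parameters())
    for _ in range(50):
        opt.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        opt.step()
    results[name] = loss.item()     # final training loss per optimizer
```

Without the `deepcopy` reset, the second optimizer would start from weights the first one had already trained, making the comparison meaningless.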
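Unit 3's initialization pattern, sketched with a hypothetical network: `Module.apply` walks every submodule, and a small function applies Xavier uniform initialization only to the `Linear` layers, mirroring the "all layers at once" and "selective" lessons.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def init_xavier(m):
    # Selectively initialize only Linear layers.
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)

net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
net.apply(init_xavier)  # applies init_xavier to every submodule

# Xavier uniform samples from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)).
```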
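Unit 4 combines three regularization practices; here is one way they might fit together. The architecture, patience value, and synthetic data are illustrative assumptions, not the course's own solution.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Dropout and batch normalization in one network (hypothetical layout).
model = nn.Sequential(
    nn.Linear(8, 32),
    nn.BatchNorm1d(32),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.MSELoss()

x, y = torch.randn(64, 8), torch.randn(64, 1)          # illustrative train set
x_val, y_val = torch.randn(16, 8), torch.randn(16, 1)  # illustrative val set

best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(100):
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

    model.eval()  # disables dropout, uses BatchNorm running stats
    with torch.no_grad():
        val_loss = criterion(model(x_val), y_val).item()
    if val_loss < best_val - 1e-4:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # early stopping: val loss stalled
            break
```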

