
Constructing and Training Neural Networks

via Udacity

Overview

This course covers foundational deep learning theory and practice. It begins with how to think about deep learning and when it is the right tool to use, then works through the fundamental algorithms, architectures, and goals of deep learning, interweaving the theory with implementation in PyTorch.

Syllabus

  • Course Overview
    • Explore course objectives, prerequisites, and your instructor as you prepare to build, train, and evaluate neural networks from scratch.
  • Introduction to Deep Learning
    • Discover deep learning: its place in AI, how neural networks learn patterns from data, why it's so powerful, and when to use it for complex, large-scale, unstructured data tasks.
  • Building Blocks of Neural Networks
    • Explore how simple artificial neurons combine into layers to form neural networks, enabling machines to learn complex patterns for tasks like image recognition and decision-making.
  • Coding Your First Single-Layer Network
    • Build and understand a single-neuron network using PyTorch; explore how weights and bias create decision boundaries; solve real and logic problems with perceptrons.
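A single-neuron network like the one described above can be sketched in PyTorch as follows. The AND-gate weights and bias are illustrative values chosen here, not taken from the course:

```python
import torch
import torch.nn as nn

# A single-neuron network: one linear layer with 2 inputs and 1 output.
# The weights and bias define a decision boundary; the hand-picked values
# below (illustrative, not from the course) realize the AND gate.
neuron = nn.Linear(2, 1)
with torch.no_grad():
    neuron.weight.copy_(torch.tensor([[1.0, 1.0]]))
    neuron.bias.copy_(torch.tensor([-1.5]))

inputs = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
# Apply a hard threshold at 0 to get perceptron-style binary outputs.
outputs = (neuron(inputs) > 0).int().flatten()
print(outputs.tolist())  # AND gate: [0, 0, 0, 1]
```

Only the point (1, 1) lands on the positive side of the line x1 + x2 = 1.5, which is exactly the AND truth table.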
  • How Do Networks Learn Non-Linearity
    • Discover how activation functions add non-linearity to neural networks, enabling them to learn complex patterns beyond linear relationships for real-world tasks.
  • Selecting Activation Functions in Practice
    • Explore how activation functions empower neural networks to learn non-linear patterns, compare Sigmoid, Tanh, and ReLU, and understand their limitations in solving complex problems.
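The three activation functions named above can be compared directly on a few sample inputs; this is a minimal sketch, not course code:

```python
import torch

x = torch.tensor([-2.0, 0.0, 2.0])

# Sigmoid squashes to (0, 1); it saturates for large |x|, which can
# cause vanishing gradients in deep stacks.
print(torch.sigmoid(x))

# Tanh squashes to (-1, 1) and is zero-centered, but still saturates.
print(torch.tanh(x))

# ReLU passes positives through and zeroes negatives; cheap to compute and
# non-saturating for x > 0, though units can "die" if inputs stay negative.
print(torch.relu(x))
```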
  • From Single-layer to Deep Neural Networks
    • Explore how neural networks use layers, depth, and width to learn complex patterns. Learn to design and balance network architectures for effective deep learning solutions.
  • Architecting Multi-layer Neural Networks
    • Learn to build, analyze, and compare multi-layer neural networks (MLPs) in PyTorch, exploring model architecture, hidden layers, and parameter counts with real and classic datasets.
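A small MLP and its parameter count might look like the following sketch; the layer sizes are arbitrary, chosen only to make the parameter arithmetic concrete:

```python
import torch.nn as nn

# A small MLP: 4 inputs -> hidden layer of 16 units -> 3 outputs.
# Sizes are illustrative assumptions, not the course's architecture.
mlp = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Linear(16, 3),
)

# Parameter count: layer 1 has 4*16 weights + 16 biases = 80,
# layer 2 has 16*3 weights + 3 biases = 51, so 131 in total.
n_params = sum(p.numel() for p in mlp.parameters())
print(n_params)  # 131
```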
  • How Do Networks Move Data Forward
    • Discover how data moves through neural networks using computational graphs and tensor shapes, enabling you to trace, debug, and refine model predictions with confidence.
  • Following Tensors Through Forward Propagation
    • Learn to control data flow through PyTorch neural networks, implement custom forward passes, and debug architectural errors using shape tracking for robust model building.
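Shape tracking in a custom forward pass can be sketched with a toy model (the layer sizes here are assumptions for illustration):

```python
import torch
import torch.nn as nn

class ShapeTracer(nn.Module):
    """Toy model (not from the course) that prints tensor shapes at each step."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 4)
        self.fc2 = nn.Linear(4, 2)

    def forward(self, x):
        print("input:", tuple(x.shape))      # (batch, 8)
        x = torch.relu(self.fc1(x))
        print("after fc1:", tuple(x.shape))  # (batch, 4)
        x = self.fc2(x)
        print("after fc2:", tuple(x.shape))  # (batch, 2)
        return x

out = ShapeTracer()(torch.randn(5, 8))
```

Printing shapes at each step is a quick way to catch the dimension mismatches that cause most architectural errors.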
  • How Do Networks Measure Error
    • Learn how neural networks use loss functions to measure errors, guide learning, and choose the right function for regression, binary, and multiclass tasks.
  • Investigating Loss Functions In Different Tasks
    • Learn how and why to select and implement appropriate loss functions for regression (MSE) and binary classification (BCE) tasks to ensure effective neural network training.
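The two loss functions mentioned can be sketched on tiny hand-picked tensors (values chosen here only to make the arithmetic checkable):

```python
import torch
import torch.nn as nn

# Regression: mean squared error penalizes squared distance to the target.
mse = nn.MSELoss()
pred, target = torch.tensor([2.0, 4.0]), torch.tensor([1.0, 3.0])
print(mse(pred, target))  # (1^2 + 1^2) / 2 = 1.0

# Binary classification: BCEWithLogitsLoss fuses sigmoid + BCE and is
# numerically stabler than applying them separately.
bce = nn.BCEWithLogitsLoss()
logits, labels = torch.tensor([0.0]), torch.tensor([1.0])
print(bce(logits, labels))  # -log(sigmoid(0)) = log 2 ≈ 0.693
```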
  • How Do Networks Learn During Training
    • Explore how neural networks learn: gradient descent and backpropagation optimize weights to reduce error, with optimizers like Adam accelerating and stabilizing the training process.
  • Building Your First Training Loop
    • Learn to build a five-step PyTorch training loop, prepare data, train neural networks, and compare optimizers (SGD vs. Adam) for effective model learning.
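A five-step training loop commonly breaks down as forward pass, loss, zero-grad, backward, and optimizer step; the synthetic data below (learning y = 2x) is an assumption for illustration, not the course's example:

```python
import torch
import torch.nn as nn

# Toy data: learn y = 2x (synthetic, for illustration only).
X = torch.linspace(-1, 1, 32).unsqueeze(1)
y = 2 * X

model = nn.Linear(1, 1)
loss_fn = nn.MSELoss()
# Swap in torch.optim.Adam(model.parameters(), lr=0.1) to compare optimizers.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(200):
    pred = model(X)              # 1. forward pass
    loss = loss_fn(pred, y)      # 2. compute loss
    optimizer.zero_grad()        # 3. clear stale gradients
    loss.backward()              # 4. backpropagate
    optimizer.step()             # 5. update weights

print(model.weight.item())  # converges near 2.0
```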
  • How Do You Prepare Data for Training
    • Learn essential steps to prepare data for machine learning: split datasets, preprocess for quality and optimization, and efficiently load data for reliable model training.
  • Building Efficient Data Loading Pipelines
    • Learn to build efficient data pipelines: clean, encode, scale, batch, and load tabular data for PyTorch models, optimizing preprocessing and DataLoader settings for robust training.
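A minimal tabular pipeline with scaling, batching, and a DataLoader can be sketched as follows (the random features and sizes are placeholders, not course data):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Toy tabular data: 10 rows, 3 numeric features, binary labels (illustrative).
features = torch.randn(10, 3)
# Standard-scale each column so features share a comparable range.
features = (features - features.mean(0)) / features.std(0)
labels = torch.randint(0, 2, (10,)).float()

dataset = TensorDataset(features, labels)
# shuffle=True randomizes order each epoch; batch_size trades throughput
# against gradient noise.
loader = DataLoader(dataset, batch_size=4, shuffle=True)

for batch_x, batch_y in loader:
    print(batch_x.shape, batch_y.shape)  # up to (4, 3) and (4,)
```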
  • How Do You Diagnose Model Performance
    • Learn to diagnose machine learning model performance by identifying underfitting, overfitting, and training instability using the bias-variance tradeoff and loss curves.
  • Plotting and Interpreting Loss Curves
    • Learn to plot and interpret loss curves to diagnose underfitting, overfitting, good fit, and unstable training by comparing training and validation losses during model training.
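The overfitting signature in loss curves can also be read off numerically; the loss histories below are hypothetical values made up to illustrate the pattern:

```python
# Hypothetical loss histories (invented to show the pattern, not course data).
train_loss = [1.0, 0.6, 0.4, 0.25, 0.15, 0.10]
val_loss   = [1.1, 0.7, 0.5, 0.55, 0.65, 0.80]

# Overfitting signature: training loss keeps falling while validation loss,
# after an initial drop, turns upward past its minimum.
best_epoch = min(range(len(val_loss)), key=val_loss.__getitem__)
overfitting = (val_loss[-1] > val_loss[best_epoch]
               and train_loss[-1] < train_loss[best_epoch])
print(best_epoch, overfitting)  # 2 True
```

The same comparison drawn as two curves is what the lesson's plots make visible at a glance.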
  • How Do You Evaluate a Model
    • Learn to evaluate machine learning models beyond loss and accuracy by choosing metrics like precision, recall, MAE, and RMSE that align with real-world needs and error costs.
  • Measuring Model Performance Beyond Accuracy
    • Learn to assess model performance using metrics beyond accuracy, using precision, recall, F1, PR and ROC curves to evaluate and optimize models, especially with imbalanced data.
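Why accuracy misleads on imbalanced data can be shown with a tiny hand-built confusion count (the labels below are hypothetical):

```python
# Imbalanced toy labels: accuracy looks fine, recall reveals the misses.
y_true = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
y_pred = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]

tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))        # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)  # 0.8 accuracy, but recall is only 1/3
```

Despite 80% accuracy, the model misses two of the three positives, which is exactly what recall (and F1) surface.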
  • How Do You Improve Model Performance
    • Learn a systematic framework to improve deep learning models by diagnosing issues and applying techniques in Data, Model, Optimization, and Inference to boost performance and stability.
  • Tuning Model Performance Across Scenarios
    • Learn to diagnose and address overfitting, underfitting, and instability by applying techniques like dropout, learning rate decay, and systematic model improvement strategies.
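Dropout and learning-rate decay can be combined in a short sketch; the model shape, dropout rate, and decay schedule are assumptions chosen for illustration:

```python
import torch
import torch.nn as nn

# Dropout randomly zeroes activations during training to reduce overfitting.
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(16, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
# Learning-rate decay: halve the lr every 10 epochs to stabilize late training.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

model.train()  # dropout active in training mode
for epoch in range(20):
    loss = model(torch.randn(4, 8)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()

model.eval()   # dropout disabled for evaluation
print(optimizer.param_groups[0]["lr"])  # 0.01 -> 0.005 -> 0.0025
```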
  • Moving Beyond Fully-Connected Networks
    • Discover why fully-connected networks struggle with structured data and how specialized architectures like CNNs, RNNs, and Transformers leverage inductive bias for better AI solutions.
  • Diabetes Risk Prediction with PyTorch
    • In this project, you will design a multi-layer perceptron (MLP) to predict diabetes risk using CDC health data, developing a complete workflow from raw data to a tuned and tested deep learning model.

Taught by

Samantha Guerriero

Reviews

Rated 4.0 out of 5 at Udacity, based on 1 rating.
