Zero To Mastery

Build a Simple Neural Network & Learn Backpropagation

via Zero To Mastery

Overview

Learn about backpropagation and gradient descent by coding your own simple neural network from scratch in Python - no libraries, just fundamentals. Ideal for aspiring Machine Learning Engineers, Data Scientists, and AI Specialists.
  • Coding neural networks from scratch using only Python
  • What backpropagation is and how it helps machines learn
  • How to break down complicated math into simple, doable steps
  • The easiest way to understand gradients and why they matter
  • What’s really happening when a machine makes predictions
  • How to train a smarter model by adjusting tiny details in code
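The outcomes above boil down to one training loop: run a forward pass, measure the loss, get the gradient via the chain rule, and nudge the weight in the opposite direction. A minimal pure-Python sketch of that loop (an illustrative example under assumed names, not the course's actual code):

```python
# Minimal sketch (assumed example, not the course's code): a one-weight
# "network" y_hat = w * x trained with mean squared error, no libraries.

def train(x, y, w=0.0, alpha=0.1, epochs=20):
    """Fit w so that w * x approximates y, via gradient descent."""
    for _ in range(epochs):
        y_hat = w * x                # forward pass
        loss = (y_hat - y) ** 2     # mean squared error
        grad = 2 * (y_hat - y) * x  # dL/dw via the chain rule
        w -= alpha * grad           # step opposite the gradient
    return w

w = train(x=2.0, y=4.0)  # converges toward w = 2, since 2 * 2.0 = 4.0
```

Each iteration shrinks the error by a constant factor here, which is why even this tiny example makes the role of the learning rate (alpha) visible: too large and the updates overshoot, too small and training crawls.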

Syllabus

  •   Introduction
    • Introduction
    • Exercise: Meet Your Classmates and Instructor
    • Course Resources
  •   Neural Networks, Derivatives, Gradients, Chain Rule, and Gradient Descent
    • Introduction to Our Simple Neural Network
    • Why We Use Computational Graphs
    • Conducting the Forward Pass
    • Roadmap to Understanding Backpropagation
    • Derivatives Theory
    • Numerical Example of Derivatives
    • Partial Derivatives
    • Gradients
    • Understanding What Partial Derivatives Do
    • Introduction to Backpropagation
    • (Optional) Chain Rule
    • Gradient Derivation of Mean Squared Error Loss Function
    • Visualizing the Loss Function and Understanding Gradients
    • Using the Chain Rule to See how w2 Affects the Final Loss
    • Backpropagation of w1
    • Introduction to Gradient Descent Visually
    • Gradient Descent
    • Understanding the Learning Rate (Alpha)
    • Moving in the Opposite Direction of the Gradient
    • Calculating Gradient Descent by Hand
    • Coding our Simple Neural Network Part 1
    • Coding our Simple Neural Network Part 2
    • Coding our Simple Neural Network Part 3
    • Coding our Simple Neural Network Part 4
    • Coding our Simple Neural Network Part 5
  •   Implementing Our Advanced Neural Network by Hand + Python
    • Introduction to Our Complex Neural Network
    • Conducting the Forward Pass
    • Getting Started with Backpropagation
    • Getting the Derivative of the Sigmoid Activation Function (Optional)
    • Implementing Backpropagation with the Chain Rule
    • Understanding How w3 Affects the Final Loss
    • Calculating Gradients for Z1
    • Understanding How w1 and w2 Affect the Loss
    • Implementing Gradient Descent by Hand
    • Coding our Advanced Neural Network Part 1 (Implementing Forward Pass + Loss)
    • Coding our Advanced Neural Network Part 2 (Implement Backpropagation)
    • Coding our Advanced Neural Network Part 3 (Implement Gradient Descent)
    • Coding our Advanced Neural Network Part 4 (Training our Neural Network)
  •   Where To Go From Here?
    • Review This Byte!
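The advanced-network section above (forward pass, sigmoid derivative, chain rule through z1, gradient descent on w1, w2, and w3) can be sketched roughly as follows. The variable names follow the lecture titles, but the exact wiring is an assumption for illustration, not the course's actual network:

```python
import math

# Hedged sketch of a tiny two-layer network: z1 = w1*x1 + w2*x2,
# a1 = sigmoid(z1), y_hat = w3 * a1, trained on squared error.
# (Assumed wiring based on the lecture titles, not the course's code.)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_deriv(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # d(sigma)/dz = sigma(z) * (1 - sigma(z))

def step(x1, x2, y, w1, w2, w3, alpha=0.5):
    """One forward pass, backward pass, and gradient-descent update."""
    # forward pass
    z1 = w1 * x1 + w2 * x2
    a1 = sigmoid(z1)
    y_hat = w3 * a1
    loss = (y_hat - y) ** 2

    # backward pass via the chain rule
    dL_dyhat = 2 * (y_hat - y)
    dL_dw3 = dL_dyhat * a1                        # how w3 affects the loss
    dL_dz1 = dL_dyhat * w3 * sigmoid_deriv(z1)    # gradient at z1
    dL_dw1 = dL_dz1 * x1                          # how w1 affects the loss
    dL_dw2 = dL_dz1 * x2                          # how w2 affects the loss

    # gradient descent: move opposite the gradient
    return (w1 - alpha * dL_dw1,
            w2 - alpha * dL_dw2,
            w3 - alpha * dL_dw3,
            loss)
```

Repeatedly calling `step` drives the loss down, which is the whole of training: the chain rule tells each weight how it contributed to the error, and gradient descent spends that information one small update at a time.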

Taught by

Patrik Szepesi
