Training a Neural Network - Backward Propagation and Gradient Descent
Valerio Velardo - The Sound of AI via YouTube
Overview
Explore the theory and mathematics behind training neural networks with backpropagation and gradient descent in this 22-minute video. Start with a high-level overview of the training process, meet the "prediction wizard" and "error wizard" metaphors, derive the gradient of the error function, and examine the elements of a neural network. Then learn how gradient descent uses that gradient to optimize the network's weights. Accompanying slides are available as a visual aid, and you can join The Sound of AI community for further discussion, hire the presenter as a consultant, or connect through social media for additional resources and networking.
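The core idea the video covers, gradient descent, can be illustrated with a minimal sketch. This is not the presenter's code; it is a hypothetical example that minimizes a simple one-dimensional function f(w) = (w - 3)^2 by repeatedly stepping against its gradient f'(w) = 2(w - 3), with an assumed learning rate and step count.

```python
def gradient(w):
    # Derivative of f(w) = (w - 3)^2
    return 2 * (w - 3)

def gradient_descent(w0, lr=0.1, steps=100):
    """Repeatedly move w a small step against the gradient."""
    w = w0
    for _ in range(steps):
        w -= lr * gradient(w)  # update rule: w <- w - lr * dE/dw
    return w

w_opt = gradient_descent(w0=0.0)
print(w_opt)  # converges toward the minimum at w = 3
```

The same update rule, applied to every weight and bias via the chain rule, is what trains a neural network.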
Syllabus
Introduction
High-level overview
Prediction wizard
Error wizard
Gradient of error function
Neural network elements
Gradient descent
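The syllabus steps above (prediction, error, gradient, descent) can be sketched for a single sigmoid neuron. This is an illustrative assumption, not the video's code: it assumes a squared-error loss E = 0.5(a - target)^2 and a hand-picked learning rate.

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def train_step(w, b, x, target, lr=0.5):
    # Prediction (forward pass)
    a = sigmoid(w * x + b)
    # Error gradient via the chain rule:
    # dE/dz = (a - target) * sigmoid'(z), with sigmoid'(z) = a * (1 - a)
    delta = (a - target) * a * (1 - a)
    # Gradient descent update on weight and bias
    w -= lr * delta * x
    b -= lr * delta
    return w, b

w, b = 0.0, 0.0
for _ in range(2000):
    w, b = train_step(w, b, x=1.0, target=0.8)
print(sigmoid(w * 1.0 + b))  # prediction approaches the target of 0.8
```

A full network repeats this backward pass layer by layer, propagating each delta through the chain rule.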
Taught by
Valerio Velardo - The Sound of AI