
YouTube

Adam Optimizer from Scratch in Python

Yacine Mahdid via YouTube

Overview

Learn to implement the Adam optimizer from scratch using Python and NumPy in this 15-minute tutorial that demystifies one of the most popular optimization algorithms used in deep neural network training. Begin with an introduction to the Adam optimizer and its significance in machine learning, then dive into the theoretical foundations that make the algorithm effective. Follow along with a detailed formula walkthrough that breaks down the mathematical components step by step, making the concepts accessible even to those new to optimization algorithms. Progress to the hands-on NumPy implementation section, where you'll code the Adam optimizer from the ground up and discover that the implementation is more straightforward than you might expect. Conclude by seeing the optimizer in action and understanding why this powerful tool has become a standard choice for training neural networks across a wide range of applications.
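The video's exact code isn't reproduced here, but the Adam update rule it walks through can be sketched in a few lines of NumPy. The function below is a minimal illustration: it keeps the standard first- and second-moment estimates with bias correction, using the default hyperparameters from the original Adam paper (the variable names `m`, `v`, `beta1`, `beta2` are conventional, not taken from the video).

```python
import numpy as np

def adam_update(params, grads, m, v, t,
                lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step. t is the 1-based iteration counter."""
    # Update biased first moment (running mean of gradients)
    m = beta1 * m + (1 - beta1) * grads
    # Update biased second moment (running mean of squared gradients)
    v = beta2 * v + (1 - beta2) * grads ** 2
    # Bias-corrected estimates (moments start at zero, so early
    # steps would otherwise be underestimated)
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Parameter update scaled per-coordinate by the RMS of gradients
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# Toy usage: minimize f(x) = x^2, whose gradient is 2x
x = np.array([5.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 2001):
    grad = 2 * x
    x, m, v = adam_update(x, grad, m, v, t, lr=0.1)
# x has converged close to the minimum at 0
```

Running the loop drives `x` toward 0, which is a quick sanity check that the moment estimates and bias correction are wired up correctly.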

Syllabus

- introduction: 0:00
- theory: 0:35
- formula walkthrough: 1:48
- numpy implementation: 10:28
- success: 14:10
- there isn't much to it: 14:30

Taught by

Yacine Mahdid

