Long Short-Term Memory (LSTM) Models with TensorFlow
MLCon | Machine Learning Conference via YouTube
Overview
Dive into Long Short-Term Memory (LSTM) models in this code-heavy conference session presented by Sahil Dua at the Machine Learning Conference. Learn how to implement an LSTM, a Recurrent Neural Network (RNN) architecture known for its ability to retain information over long intervals and to achieve state-of-the-art performance on sequence classification problems. Follow the step-by-step process of writing an LSTM for natural language understanding using TensorFlow's Python API, while gaining insight into the underlying mathematical concepts. This one-hour talk is designed to build both practical coding experience and a theoretical understanding of LSTM models.
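As a taste of what the session covers, a minimal LSTM sequence classifier in TensorFlow's Keras API might look like the sketch below. The vocabulary size, sequence length, and layer dimensions are illustrative assumptions, not values taken from the talk:

```python
# Minimal sketch of an LSTM-based sequence classifier in TensorFlow/Keras.
# Hyperparameters here (vocab size, embedding/hidden dims) are assumptions
# for illustration, not values from the conference session.
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 1000   # assumed vocabulary size
MAX_LEN = 20        # assumed (padded) sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 32),       # token ids -> dense vectors
    tf.keras.layers.LSTM(64),                        # final hidden state summarizes the sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary sequence classification
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy batch: 8 padded sequences of random token ids (stand-in for real text).
x = np.random.randint(0, VOCAB_SIZE, size=(8, MAX_LEN))
probs = model.predict(x, verbose=0)
print(probs.shape)  # one probability per input sequence
```

In a real natural language understanding task, the random token ids would be replaced by tokenized, padded text, and the model would be fit on labeled sequences before predicting.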
Syllabus
Long Short-Term Memory (LSTM) models with TensorFlow | ML Conference Session Sahil Dua
Taught by
MLCon | Machine Learning Conference