Overview
Learn essential techniques for preventing out-of-memory (OOM) errors when training large language models in this 46-minute tutorial. Discover practical GPU setup strategies and memory optimization methods that maximize training efficiency, then explore fine-tuning GPT-OSS models for long-context applications with Unsloth. The tutorial covers both the theory of memory management and hands-on strategies for training larger models without hitting hardware limits.
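The tutorial's exact memory-budgeting method isn't reproduced here, but a widely used back-of-the-envelope rule helps motivate the topic: mixed-precision Adam training keeps fp16 weights and gradients plus fp32 master weights and two Adam moment buffers, roughly 16 bytes per parameter before activations. A minimal sketch under that assumption (the function name and constants are illustrative, not from the tutorial):

```python
def adam_training_footprint_gib(n_params: int) -> float:
    """Rough model-state memory for mixed-precision Adam training.

    Assumes the common ~16 bytes/parameter breakdown:
      fp16 weights (2) + fp16 gradients (2)
      + fp32 master weights (4) + Adam moment m (4) + Adam moment v (4)
    Activations, KV caches, and CUDA overhead are NOT included,
    so real peak usage is higher.
    """
    bytes_total = n_params * (2 + 2 + 4 + 4 + 4)
    return bytes_total / 2**30  # convert bytes to GiB


if __name__ == "__main__":
    # A 7B-parameter model needs roughly 104 GiB of model state alone,
    # which is why memory-saving techniques like LoRA matter.
    print(f"{adam_training_footprint_gib(7_000_000_000):.1f} GiB")
```

Activation memory scales with batch size and sequence length on top of this, which is where the OOM-avoidance techniques in the syllabus come in.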
Syllabus
0:00 GPU setup
2:30 How to avoid going OOM
24:20 Fine-tuning GPT-OSS for long context with Unsloth
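One standard OOM-avoidance technique a segment like "How to avoid going OOM" typically covers is gradient accumulation: run several small micro-batches and sum their gradients before each optimizer step, so the effective batch size grows without increasing peak activation memory. A minimal pure-Python sketch on a scalar linear model (illustrative only, not the tutorial's actual code):

```python
def grad(w: float, x: float, y: float) -> float:
    """d/dw of the squared error (w*x - y)**2 for one example."""
    return 2 * (w * x - y) * x


def accumulated_grad(w, batch, micro_batch_size):
    """Average gradient over `batch`, accumulated micro-batch by micro-batch.

    Only one micro-batch is "in memory" at a time; the running sum stands in
    for the accumulated gradient buffer a real framework would keep.
    """
    total = 0.0
    for i in range(0, len(batch), micro_batch_size):
        micro = batch[i:i + micro_batch_size]
        total += sum(grad(w, x, y) for x, y in micro)
    return total / len(batch)


if __name__ == "__main__":
    data = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 7.0)]
    full = accumulated_grad(0.5, data, micro_batch_size=len(data))
    accum = accumulated_grad(0.5, data, micro_batch_size=1)
    assert abs(full - accum) < 1e-12  # same gradient, lower peak memory
```

Because the accumulated gradient is mathematically identical to the full-batch gradient, this trades extra forward/backward passes for memory rather than changing the optimization result.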
Taught by
Trelis Research