
Scaling Down, Powering Up - Can Efficient Training Beat Scaling Laws?

MLOps World: Machine Learning in Production via YouTube

Overview

Explore cutting-edge strategies for training efficient language models that challenge traditional scaling paradigms in this conference presentation. Discover how data-centric and model-centric approaches can achieve strong AI performance without massive computational cost, using DeepSeek's success as a prime example of thoughtful engineering over brute-force scaling.

Learn about the rise of small language models (SLMs) as cost-effective alternatives to dense large language models, and master data enhancement techniques, including mixing, filtering, and deduplication, that improve dataset quality. Dive into model optimization methods such as pruning, distillation, parameter-efficient fine-tuning, quantization, and model merging that streamline architectures while maintaining performance.

Understand how strategic data preparation and intelligent model design can produce capable language models without the prohibitive financial investment traditionally associated with scaling AI systems, demonstrating that efficiency and thoughtful engineering can rival raw computational power in modern machine learning applications.
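Of the data enhancement techniques named above, exact deduplication is the simplest to illustrate. The sketch below is not from the talk; it is a minimal, assumed approach that drops repeated documents by hashing a normalized form of each text, a common first pass before fuzzier methods like MinHash.

```python
import hashlib

def deduplicate(docs):
    """Drop exact-duplicate documents by hashing normalized text.

    Normalization here (strip + lowercase) is a simplifying assumption;
    production pipelines typically normalize more aggressively and add
    near-duplicate detection on top of this exact-match pass.
    """
    seen = set()
    unique = []
    for doc in docs:
        key = hashlib.sha256(doc.strip().lower().encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = ["The cat sat.", "the cat sat.", "A dog ran."]
print(deduplicate(corpus))  # → ['The cat sat.', 'A dog ran.']
```

Hashing keeps memory proportional to the number of unique documents rather than their total size, which matters when filtering web-scale corpora.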

Syllabus

Scaling Down, Powering Up: Can Efficient Training Beat Scaling Laws?

Taught by

MLOps World: Machine Learning in Production

