
Scaling Down, Powering Up: Can Efficient Training Beat Scaling Laws?

MLOps World: Machine Learning in Production via YouTube

Overview

This conference talk from MLOps World: Machine Learning in Production features Malikeh Ehghaghi, Machine Learning Research Scientist at the Vector Institute, challenging traditional beliefs about scaling laws in large language models (LLMs). Explore how strategies that prioritize efficiency and cost-effectiveness can potentially outperform simply scaling up model parameters and data volume. DeepSeek's success is presented as evidence that thoughtful data engineering and careful model design can achieve strong AI performance without prohibitive costs. The 31-minute presentation covers state-of-the-art data-centric approaches (data mixing, filtering, deduplication) and model-centric strategies (pruning, distillation, parameter-efficient fine-tuning, quantization, model merging) for efficient language model training. Learn about the rise of small language models (SLMs) as cost-efficient alternatives to dense LLMs, and how careful preparation can produce strong results without the massive financial investment traditionally considered necessary for scaling AI systems.
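To give a flavor of one of the model-centric strategies listed above, the sketch below shows symmetric per-tensor int8 quantization, one of the simplest forms of the quantization the talk covers. This is a generic illustration, not code from the presentation: the function names and the toy weight values are invented for the example.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization: map float weights into [-127, 127]."""
    scale = np.abs(weights).max() / 127.0  # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

# Toy example: a small weight vector stands in for a model layer.
w = np.array([0.5, -1.0, 0.25, 0.9], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Rounding error per element is bounded by scale / 2.
```

Storing int8 values instead of float32 cuts weight memory roughly 4x, which is one reason quantization features prominently in cost-efficient training and deployment.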

Syllabus

Scaling Down, Powering Up: Can Efficient Training Beat Scaling Laws?

Taught by

MLOps World: Machine Learning in Production
