

From Mixture of Experts to Mixture of Agents with Super Fast Inference

AI Engineer via YouTube

Overview

Learn to build your own Mixture of Agents (MoA) system using cutting-edge open models like Qwen3-32B and Llama 3.3-70B in this hands-on workshop. Discover how MoA, an emerging architecture, combines multiple large language models in a layered, agent-based design, letting specialized agents collaborate across layers to outperform today's frontier models in both accuracy and efficiency. Explore the foundational concepts by examining how Mixture of Experts (MoE) architectures continue to push the boundaries of scale and specialization, with insights from Cerebras's Head Research Scientist on training state-of-the-art MoEs. Gain practical experience implementing these architectures while understanding their theoretical underpinnings and real-world applications for building more capable and efficient AI systems.
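
To make the layered design concrete, here is a minimal two-layer MoA sketch against an OpenAI-compatible chat endpoint. The base URL, environment variable, and model identifiers are placeholder assumptions (the workshop's Qwen3-32B and Llama 3.3-70B would slot in as proposer and aggregator models), not the workshop's actual code.

```python
# Minimal Mixture-of-Agents sketch: several "proposer" agents draft answers
# in layer 1, then an "aggregator" agent synthesizes them in layer 2.
# base_url, API_KEY, and model names below are hypothetical placeholders.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",             # hypothetical endpoint
    api_key=os.environ.get("API_KEY", "sk-placeholder"),
)

def ask(model: str, prompt: str) -> str:
    """Send one chat turn to a model and return its text reply."""
    resp = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content

def mixture_of_agents(question: str, proposers: list[str], aggregator: str) -> str:
    # Layer 1: each proposer answers independently.
    drafts = [ask(m, question) for m in proposers]
    # Layer 2: the aggregator reads all drafts and produces a single answer.
    combined = "\n\n".join(f"Response {i + 1}:\n{d}" for i, d in enumerate(drafts))
    return ask(
        aggregator,
        f"Synthesize the best possible answer to:\n{question}\n\n"
        f"Candidate responses:\n{combined}",
    )

# Example (placeholder model names):
# print(mixture_of_agents("Explain MoE routing.",
#                         ["qwen-3-32b", "llama-3.3-70b"], "llama-3.3-70b"))
```

On the MoE side, the sketch below shows the core routing idea: a learned gate sends each token to only its top-k experts, so model capacity grows without a matching increase in per-token compute. Dimensions, expert count, and the toy linear experts are illustrative assumptions, not the architecture discussed in the talk.

```python
# Toy top-k expert routing, the mechanism at the heart of MoE layers.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 4, 2

W_gate = rng.normal(size=(d_model, n_experts))                  # router weights
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ W_gate
    top = np.argsort(logits)[-top_k:]                           # k best experts
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()   # renormalized softmax
    # Only the selected experts run, which is why MoE inference stays cheap.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=d_model)
print(moe_layer(token).shape)  # (16,)
```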
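Both sketches are simplified: a production MoA stacks more proposer layers and feeds intermediate syntheses forward, and a real MoE learns its gate jointly with the experts.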

Syllabus

From Mixture of Experts to Mixture of Agents with Super Fast Inference - Daniel Kim & Daria Soboleva

Taught by

AI Engineer

