Overview
Learn to build your own Mixture of Agents (MoA) system using cutting-edge open models such as Qwen3-32B and Llama 3.3-70B in this hands-on workshop. MoA is an emerging architecture that combines multiple large language models in a layered, agent-based design: specialized agents collaborate across layers, with each layer refining the outputs of the previous one, and on some benchmarks this collaboration has matched or exceeded the accuracy of today's frontier models at lower cost. You will also explore the foundational concepts behind MoA by examining how Mixture of Experts (MoE) architectures push the boundaries of scale and specialization, with insights from Cerebras's Head Research Scientist on training state-of-the-art MoEs. Gain practical experience implementing these architectures while understanding their theoretical underpinnings and real-world applications for building more capable and efficient AI systems.
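The layered design described above can be sketched in a few lines of Python. This is a minimal illustration, not the workshop's actual code: the `call_model` stub, the function names, and the lowercase model identifiers are all hypothetical placeholders, and in a real system each call would go to an inference endpoint serving models such as Qwen3-32B or Llama 3.3-70B.

```python
def call_model(model: str, prompt: str) -> str:
    """Hypothetical stub for a chat-completion call to `model`.
    A real implementation would query an inference API here."""
    return f"[{model}] draft answer to: {prompt}"


def moa_layer(models: list[str], prompt: str, prior: list[str]) -> list[str]:
    """One MoA layer: each agent sees the user prompt plus all
    responses from the previous layer and produces a refined draft."""
    if prior:
        prompt = prompt + "\n\nPrevious responses:\n" + "\n".join(prior)
    return [call_model(m, prompt) for m in models]


def mixture_of_agents(prompt: str, layers: list[list[str]],
                      aggregator: str) -> str:
    """Run the proposer layers in sequence, then have a single
    aggregator model synthesize the final answer from the last
    layer's drafts."""
    drafts: list[str] = []
    for models in layers:
        drafts = moa_layer(models, prompt, drafts)
    synthesis_prompt = (prompt + "\n\nSynthesize the best final answer from:\n"
                        + "\n".join(drafts))
    return call_model(aggregator, synthesis_prompt)


answer = mixture_of_agents(
    "Explain mixture of experts in one sentence.",
    layers=[["qwen3-32b", "llama-3.3-70b"], ["qwen3-32b"]],
    aggregator="llama-3.3-70b",
)
print(answer)
```

The key design point is that later layers condition on every draft from the layer before, so weaker individual models can correct and build on one another before a final aggregator produces the answer.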
Syllabus
From Mixture of Experts to Mixture of Agents with Super Fast Inference - Daniel Kim & Daria Soboleva
Taught by
AI Engineer