
YouTube

15B Active MoE Beats Opus 4.6 in Reasoning

Discover AI via YouTube

Overview

Explore the architecture of a Mixture-of-Experts (MoE) model with 15 billion active parameters that outperforms Opus 4.6 on reasoning tasks. Examine the internal workings of this reasoning engine, including the methods and optimization algorithms used to build open-source MoE models. Learn about the techniques presented in the MiMo-V2-Flash Technical Report from LLM-Core Xiaomi, and see how modern MoE architectures achieve strong performance through expert routing and activation strategies. Understand how the mixture-of-experts design lets the model excel at complex reasoning while staying efficient: only a small fraction of the total parameters is active for any given token.
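To make the expert-routing idea concrete, here is a minimal sketch of top-k MoE routing in plain NumPy. This is a generic illustration, not code from the MiMo-V2-Flash report: the function names (`moe_forward`), dimensions, and the use of random linear experts are all assumptions for demonstration.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token through the top-k experts of an MoE layer.

    x: (d,) token hidden state; gate_w: (d, n_experts) router weights;
    experts: list of callables mapping (d,) -> (d,). All names here are
    illustrative, not from the MiMo-V2-Flash report.
    """
    logits = x @ gate_w                    # one router score per expert
    topk = np.argsort(logits)[-k:]         # indices of the k highest-scoring experts
    weights = np.exp(logits[topk] - logits[topk].max())
    weights /= weights.sum()               # softmax over the selected experts only
    # Only the chosen experts run, so compute scales with k, not n_experts --
    # this is what keeps a large-total-parameter MoE cheap per token.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
experts = [(lambda W: (lambda h: np.tanh(h @ W)))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
out = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
print(out.shape)
```

With `k=2` of 4 experts active, each token pays for two expert forward passes while the layer's total capacity spans all four; production MoE models scale the same idea to many more, and much larger, experts.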

Syllabus

15B Active MoE Beats Opus 4.6 in Reasoning

Taught by

Discover AI

