Overview
Explore the architecture and training methodology behind Mistral AI's large language models in this 18-minute conference talk by Research Scientist Devendra Singh Chaplot. Discover the technical foundations of Mistral AI's open-source models, including Mixtral 8x7B and Mixtral 8x22B, which use a sparse mixture-of-experts (MoE) architecture and are released under the Apache 2.0 license. Learn practical strategies for working with the API endpoints of Mistral's "La Plateforme" and get a preview of upcoming platform features.

Recorded live at the AI Engineer World's Fair in San Francisco, the presentation combines a theoretical overview of modern LLM architectures with hands-on guidance for developers building on Mistral's ecosystem. The talk draws on Chaplot's research background in machine learning, computer vision, and robotics, including his work at Mistral AI and at Facebook AI Research, where he built award-winning AI systems.
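To give a flavor of the sparse mixture-of-experts routing the talk covers, here is a minimal NumPy sketch of top-k expert routing. All names and shapes are illustrative assumptions for exposition; this is not Mistral's implementation, only the general MoE idea (a router scores the experts, the top-k are run, and their outputs are combined with softmax weights).

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token vector through its top-k experts.

    Illustrative sketch of sparse MoE routing (Mixtral-style models use
    8 experts with top-2 routing per token); names/shapes are assumptions.
    """
    logits = x @ gate_w                       # router score per expert
    top = np.argsort(logits)[-k:]             # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over only the selected experts
    # Weighted sum of the chosen experts' outputs; unselected experts never run.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy setup: 8 experts, each a random linear map on a 4-dim token vector.
rng = np.random.default_rng(0)
d, n_experts = 4, 8
x = rng.standard_normal(d)
gate_w = rng.standard_normal((d, n_experts))
experts = [lambda v, W=rng.standard_normal((d, d)): v @ W for _ in range(n_experts)]
y = moe_forward(x, gate_w, experts)           # same shape as the input token vector
```

The key efficiency point, which the talk discusses in the context of Mixtral, is that only k of the n experts execute per token, so active compute scales with k while total parameter count scales with n.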
Syllabus
Decoding Mistral AI's Large Language Models: Devendra Chaplot
Taught by
AI Engineer