Overview
Explore how to build robust, scalable AI agents using open source components in this conference talk from DevConf.US 2025. Learn why many LLM-based AI agents fail in real-world applications due to fragility, scaling issues, and tight coupling to specific toolchains, then discover how the Model Context Protocol (MCP) addresses these challenges.

Understand MCP as an open standard that functions like a USB-C port for AI applications, enabling seamless connectivity between AI assistants and real data sources, including content repositories, business tools, and development environments, without requiring custom integrations for every tool.

Follow along as the speakers demonstrate a complete open AI agent architecture that integrates vLLM for efficient model inference, Llama Stack as the open source agent framework, MCP for tool invocation and data flow management, and Kubernetes for scalable cloud-native deployment.

Gain practical insights through live demonstrations of the system in action, architectural walkthroughs, and lessons learned from real implementation experience. Determine whether MCP represents genuine innovation or mere hype in the AI ecosystem, and acquire actionable knowledge for building more dynamic and efficient AI applications using open source technologies.
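To make the "tool invocation" role of MCP concrete, here is a minimal sketch of the message shape involved. MCP frames communication as JSON-RPC 2.0, with tool calls sent via a "tools/call" method; the tool name ("search_repo") and its arguments below are invented for illustration and are not from the talk.

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP-style tools/call request as a JSON-RPC 2.0 message.

    Sketch only: the envelope follows the MCP specification's JSON-RPC
    framing, but the tool and arguments are hypothetical examples.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# An agent framework (e.g. Llama Stack in the talk's architecture) would
# send a message like this to an MCP server exposing the tool:
request = make_tool_call(1, "search_repo", {"query": "deployment config"})
print(request)
```

Because every tool speaks this one protocol, the agent framework needs no per-tool custom integration, which is the "USB-C port" analogy from the talk.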
Syllabus
Putting AI Agents to Work: A New Era of Open Source Connectivity with MCP - DevConf.US 2025
Taught by
DevConf