Overview
Discover how to deploy and run open-source LLM agents in production using Kubernetes and cloud-native tools in this 42-minute conference talk from the Linux Foundation. Learn to move beyond "demo-only" AI implementations by exploring an architecture that combines LangChain, AutoGen, or CrewAI agents with enterprise-grade infrastructure: Crossplane for infrastructure orchestration, Ray for distributed compute, and Prometheus for observability. Examine the practical automation patterns, architectural decisions, and common pitfalls that emerge when transitioning AI agents from development to production deployment. Watch a live demonstration of a fully autonomous AI agent operating within a cloud-native stack, planning, executing, and scaling tasks with minimal human intervention. Gain insight into the operational challenges, and their solutions, of running AI agents reliably in real-world environments, beyond laptop-based prototypes.
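To give a flavor of the cloud-native packaging this talk covers, here is a minimal sketch of a Kubernetes Deployment for a containerized agent with Prometheus scrape annotations. The image name, port, labels, and Secret are hypothetical placeholders for illustration, not artifacts from the talk.

```yaml
# Hypothetical sketch: running an LLM agent as a Kubernetes Deployment.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-agent                        # placeholder name
spec:
  replicas: 2                            # scale beyond a single laptop process
  selector:
    matchLabels:
      app: llm-agent
  template:
    metadata:
      labels:
        app: llm-agent
      annotations:
        prometheus.io/scrape: "true"     # let Prometheus discover the metrics endpoint
        prometheus.io/port: "8080"
    spec:
      containers:
        - name: agent
          image: example.com/llm-agent:latest   # hypothetical image
          ports:
            - containerPort: 8080
          env:
            - name: OPENAI_API_KEY
              valueFrom:
                secretKeyRef:
                  name: llm-credentials  # credentials via a Secret, not baked into the image
                  key: api-key
```

Running the agent as a Deployment (rather than a one-off script) is what unlocks the replica scaling, self-healing, and metrics scraping discussed in the talk.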
Syllabus
LLMs on Autopilot: Running AI Agents on Kubernetes With Open Source Tools - Annie Talvasto, Waovo
Taught by
Linux Foundation