

LLMs on Autopilot - Running AI Agents on Kubernetes With Open Source Tools

Linux Foundation via YouTube

Overview

Discover how to deploy and run open source LLM agents in production using Kubernetes and cloud-native tools in this 42-minute conference talk from the Linux Foundation. Learn to move beyond "demo-only" AI implementations by exploring an architecture that combines LangChain, AutoGen, or CrewAI agents with enterprise-grade infrastructure tools, including Crossplane for infrastructure orchestration, Ray for distributed compute, and Prometheus for observability. Explore the automation patterns, architectural decisions, and common pitfalls that emerge when moving AI agents from development to production. Watch a live demonstration of a fully autonomous AI agent operating within a cloud-native stack, showing how agents can plan, execute, and scale tasks with minimal human intervention. Gain insights into the operational challenges of running AI agents reliably in real-world environments, moving beyond laptop-based prototypes to scalable, production-ready deployments.

Syllabus

LLMs on Autopilot: Running AI Agents on Kubernetes With Open Source Tools - Annie Talvasto, Waovo

Taught by

Linux Foundation

