Deploying Lightweight AI Agents at the Healthcare Edge With K8s and Ollama
CNCF [Cloud Native Computing Foundation] via YouTube
Overview
Explore how to deploy lightweight AI agents at the healthcare edge using Kubernetes and Ollama in this conference talk from KubeCon + CloudNativeCon. Learn how to overcome the challenges of traditional centralized LLMs, including high latency, data privacy concerns, and steep cloud costs, by implementing Kubernetes-native deployments of AI agents powered by Ollama on K3s/MicroK8s for intelligent, autonomous operations directly at the healthcare edge. Discover a real-world architecture in which multi-agent systems orchestrate hospital workflows such as patient triage, imaging coordination, and resource scheduling without sending sensitive data offsite. Understand how small LLMs deployed locally can drive powerful workflows, examine the Kubernetes primitives used to scale and monitor agents, and see how this approach achieves both operational efficiency and regulatory compliance with HIPAA and GDPR requirements. Gain insights into cloud-native engineering, AI orchestration, and real healthcare needs, along with a blueprint for deploying resilient, scalable AI agent ecosystems anywhere edge computing is needed.
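To make the Kubernetes-native pattern described above concrete, here is a minimal sketch of how an Ollama instance might be deployed on a K3s/MicroK8s edge node so that local agents can query it over its REST API. This is an illustrative assumption, not a manifest from the talk: the image tag, resource limits, and PVC name (`ollama-models`) are placeholders you would tune for your hardware and model size.

```yaml
# Hypothetical sketch: Ollama as a Kubernetes Deployment on an edge cluster.
# Image tag, resource limits, and PVC name are illustrative assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama-edge
  labels:
    app: ollama
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama
  template:
    metadata:
      labels:
        app: ollama
    spec:
      containers:
        - name: ollama
          image: ollama/ollama:latest
          ports:
            - containerPort: 11434    # Ollama's default API port
          resources:
            limits:
              memory: "8Gi"           # small LLMs fit in modest edge RAM
              cpu: "4"
          volumeMounts:
            - name: models
              mountPath: /root/.ollama   # persist pulled models across restarts
      volumes:
        - name: models
          persistentVolumeClaim:
            claimName: ollama-models     # assumed pre-provisioned PVC
---
apiVersion: v1
kind: Service
metadata:
  name: ollama
spec:
  selector:
    app: ollama
  ports:
    - port: 11434
      targetPort: 11434
```

Agent workloads in the same cluster could then reach the model at `http://ollama:11434`, keeping all inference traffic, and therefore all patient data, on the local network.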
Syllabus
Deploying Lightweight AI Agents at the Healthcare Edge With K8s + Ollama - Gary Arora & Samarth Shah
Taught by
CNCF [Cloud Native Computing Foundation]