Overview
This 45-minute webinar explores how to streamline AI deployments on Kubernetes using open source k0rdent. Learn how to automate provisioning of GPU-ready clusters, deploy scalable AI models with KServe, optimize inference workloads using Knative and Istio, and gain deployment flexibility for AI workloads on Kubernetes. Join Bharath Nallapeta, Sr. Software Engineer at Mirantis, for a complete breakdown of AI and Kubernetes integration, covering GPU access, monitoring strategies with Prometheus and Grafana, and cost-effective autoscaling techniques. The demonstration includes practical examples of provisioning an AI workload cluster, deploying AI/ML workloads, and testing the deployment. Discover how k0rdent helps AI teams deploy, scale, and optimize models while maximizing performance at minimal cost. Chapters cover the multi-cluster/multi-tenancy background, why Kubernetes necessitates platform engineering, multi-cloud environments, and an overview of the k0rdent ecosystem.
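For context on the kind of deployment the demo walks through: KServe exposes models through an `InferenceService` custom resource. The sketch below is illustrative, not taken from the webinar; the resource name, namespace, and GPU request are placeholder assumptions (the `storageUri` points at KServe's public sklearn example model):

```yaml
# Minimal KServe InferenceService sketch (illustrative; names are placeholders)
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris        # hypothetical model name
  namespace: ai-workloads   # hypothetical namespace
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      # Public example model from the KServe project
      storageUri: "gs://kfserving-examples/models/sklearn/1.0/model"
      resources:
        limits:
          nvidia.com/gpu: "1"   # request one GPU for inference
```

Applied with `kubectl apply -f`, a manifest like this lets KServe (backed by Knative) scale the inference service with traffic, which is the autoscaling behavior the webinar discusses.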
Syllabus
0:00 - Intro & background on multi-cluster/multi-tenancy
2:22 - K8s necessitates platform engineering
5:57 - Multi-cluster/multi-cloud is the new norm
8:39 - Overview of open source k0rdent
11:45 - AI deployments w/ KServe/Knative + AI monitoring w/ Prometheus/Grafana
16:56 - k0rdent demo pt. 1: AI workload cluster provisioning
26:32 - k0rdent demo pt. 2: deploying AI/ML workloads
35:38 - Testing the demo deployment
40:17 - Recap
43:04 - k0rdent application/infrastructure catalog
44:58 - Outro
Taught by
Mirantis