Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI
CNCF [Cloud Native Computing Foundation] via YouTube
Overview
Explore a case study on simplifying private Large Language Model (LLM) deployments using cloud-native technologies, specifically Kubernetes and OCI artifacts. Discover how these tools address data-governance and security challenges while enabling efficient sharing of large artifacts between model developers and consumers. Learn how Kubernetes delivers a highly portable, cloud-native inference stack, and how OCI artifacts can yield significant efficiency gains by reducing duplicate storage, increasing download speed, and minimizing governance overhead. Gain insights into incorporating Kubernetes and OCI into your MLOps journey for seamless LLM delivery in private cloud environments.
Syllabus
Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI — Autumn Moulder & Marwan Ahmed
Taught by
CNCF [Cloud Native Computing Foundation]