
ModelPack - An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments

OpenInfra Foundation via YouTube

Overview

Learn about ModelPack, an open standard for packaging, distributing, and running AI artifacts in cloud-native environments through this 36-minute conference talk. Discover how infrastructure is evolving from container-centric to model-centric approaches, with ModelPack serving as the bridge between traditional cloud-native infrastructure and modern AI/LLM workloads. Explore how this standard, previously known as ModelSpec, standardizes AI/LLM models as OCI artifacts and connects leading open-source cloud-native projects. Understand the technical details of the specification and its current adoption status within the community. See practical implementations demonstrated through real-world applications at Ant Group, showcasing seamless management and operation of AI/LLM models using established cloud-native infrastructure including Kubernetes, Dragonfly, Harbor, and CRI-O. Gain insights into how ModelPack addresses the gap between existing container orchestration systems and the unique requirements of AI model deployment and distribution in production environments.
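The core idea the talk covers, treating model weights as OCI artifacts, can be illustrated with a minimal sketch: compute a content digest for the weights and describe them in an OCI image manifest. The media types below are hypothetical placeholders for illustration, not ModelPack's actual specification values:

```python
import hashlib
import json


def oci_manifest_for_model(weights: bytes) -> dict:
    """Build a minimal OCI-style manifest describing a model blob.

    The layer and config media types are illustrative placeholders,
    not the real ModelPack media types.
    """
    weights_digest = "sha256:" + hashlib.sha256(weights).hexdigest()
    empty_config = b"{}"
    return {
        "schemaVersion": 2,
        "mediaType": "application/vnd.oci.image.manifest.v1+json",
        "config": {
            # hypothetical config media type for model metadata
            "mediaType": "application/vnd.example.model.config.v1+json",
            "digest": "sha256:" + hashlib.sha256(empty_config).hexdigest(),
            "size": len(empty_config),
        },
        "layers": [
            {
                # hypothetical layer media type for the weights themselves
                "mediaType": "application/vnd.example.model.weights.v1.tar",
                "digest": weights_digest,
                "size": len(weights),
            }
        ],
    }


manifest = oci_manifest_for_model(b"fake-model-weights")
print(json.dumps(manifest, indent=2))
```

Because the manifest is content-addressed like any container image, existing registries and distribution tooling (Harbor, Dragonfly, and similar projects mentioned in the talk) can store and replicate model artifacts without model-specific changes.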

Syllabus

ModelPack: An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments

Taught by

OpenInfra Foundation

