ModelPack - An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments
OpenInfra Foundation via YouTube
Overview
Learn about ModelPack, an open standard for packaging, distributing, and running AI artifacts in cloud-native environments through this 36-minute conference talk. Discover how infrastructure is evolving from container-centric to model-centric approaches, with ModelPack serving as the bridge between traditional cloud-native infrastructure and modern AI/LLM workloads. Explore how this standard, previously known as ModelSpec, standardizes AI/LLM models as OCI artifacts and connects leading open-source cloud-native projects. Understand the technical details of the specification and its current adoption status within the community. See practical implementations demonstrated through real-world applications at Ant Group, showcasing seamless management and operation of AI/LLM models using established cloud-native infrastructure including Kubernetes, Dragonfly, Harbor, and CRI-O. Gain insights into how ModelPack addresses the gap between existing container orchestration systems and the unique requirements of AI model deployment and distribution in production environments.
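The overview notes that ModelPack standardizes AI/LLM models as OCI artifacts, so a model can be stored and distributed by existing registries such as Harbor the same way container images are. As a rough illustration of what that means in practice, the sketch below assembles an OCI image manifest whose config and layers describe model content rather than container filesystem layers. The media type strings here are placeholders for illustration only; the actual values are defined by the ModelPack/ModelSpec specification itself.

```python
import json

# Hypothetical media types for illustration; the real ModelPack spec
# defines its own config and layer media types.
MODEL_CONFIG_MEDIA_TYPE = "application/vnd.example.model.config.v1+json"
MODEL_WEIGHTS_MEDIA_TYPE = "application/vnd.example.model.weights.v1.tar"


def build_model_manifest(config_digest, config_size, weight_layers):
    """Assemble an OCI-style image manifest describing a packaged model.

    weight_layers is a list of (digest, size) tuples, one per weights blob.
    """
    return {
        "schemaVersion": 2,
        "mediaType": "application/vnd.oci.image.manifest.v1+json",
        "config": {
            "mediaType": MODEL_CONFIG_MEDIA_TYPE,
            "digest": config_digest,
            "size": config_size,
        },
        "layers": [
            {"mediaType": MODEL_WEIGHTS_MEDIA_TYPE, "digest": d, "size": s}
            for d, s in weight_layers
        ],
    }


# Example: a manifest with one config blob and one large weights layer.
manifest = build_model_manifest(
    "sha256:" + "0" * 64,
    512,
    [("sha256:" + "1" * 64, 7_000_000_000)],
)
print(json.dumps(manifest, indent=2))
```

Because the result is an ordinary OCI manifest, registry-side tooling (Harbor, Dragonfly, CRI-O in the talk's examples) can distribute the blobs without knowing anything about model formats; only consumers that understand the model media types need to interpret them.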
Syllabus
ModelPack: An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments
Taught by
OpenInfra Foundation