ModelPack - An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments
OpenInfra Foundation via YouTube
Overview
Learn about ModelPack, an open standard for packaging, distributing, and running AI artifacts in cloud-native environments through this 36-minute conference talk. Discover how infrastructure is evolving from container-centric to model-centric approaches, with ModelPack serving as the bridge between traditional cloud-native infrastructure and modern AI/LLM workloads. Explore how this standard, previously known as ModelSpec, standardizes AI/LLM models as OCI artifacts and connects leading open-source cloud-native projects. Understand the technical details of the specification and its current adoption status within the community. See practical implementations demonstrated through real-world applications at Ant Group, showcasing seamless management and operation of AI/LLM models using established cloud-native infrastructure including Kubernetes, Dragonfly, Harbor, and CRI-O. Gain insights into how ModelPack addresses the gap between existing container orchestration systems and the unique requirements of AI model deployment and distribution in production environments.
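The overview notes that ModelPack standardizes AI/LLM models as OCI artifacts, which is what lets existing registries (Harbor) and distribution systems (Dragonfly) handle models without special casing. As a rough illustration only (the media types and sizes below are hypothetical placeholders, not the actual ModelPack specification), an OCI image manifest for a packaged model might look like:

```json
{
  "schemaVersion": 2,
  "mediaType": "application/vnd.oci.image.manifest.v1+json",
  "artifactType": "application/vnd.example.model.manifest.v1+json",
  "config": {
    "mediaType": "application/vnd.example.model.config.v1+json",
    "digest": "sha256:…",
    "size": 1024
  },
  "layers": [
    {
      "mediaType": "application/vnd.example.model.weights.v1.tar",
      "digest": "sha256:…",
      "size": 13476290560
    },
    {
      "mediaType": "application/vnd.example.model.tokenizer.v1.tar",
      "digest": "sha256:…",
      "size": 524288
    }
  ]
}
```

Because this is an ordinary OCI manifest, standard tooling such as the ORAS CLI can push and pull the model through any OCI-compliant registry, and container runtimes like CRI-O can treat model layers as content-addressed blobs.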
Syllabus
ModelPack: An Open Standard for Packaging, Distributing, and Running LLM in Cloud-Native Environments
Taught by
OpenInfra Foundation