Overview
Learn how to run AI models locally and in production using RamaLama, a new tool that simplifies AI deployment through container technology. Discover how this 35-minute conference talk by Daniel Walsh demonstrates combining AI and container technologies to make running AI models straightforward and "boring." Explore the process of getting AI models running in containers on your laptop, then scaling them to production environments including edge devices and Kubernetes clusters. Understand how RamaLama bridges the gap between local AI development and production deployment, providing developers with an efficient workflow for containerized AI model management across different computing environments.
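The local-to-production workflow described above can be sketched with RamaLama's command-line interface. This is a minimal sketch, not material from the talk itself: it assumes the `ramalama` CLI is installed, the model reference `ollama://tinyllama` is purely illustrative, and the `--generate kube` flag name is our assumption for producing Kubernetes deployment YAML.

```shell
# Sketch of the workflow: run a model locally in a container, then
# generate artifacts for production. All commands are guarded so the
# script is a no-op when RamaLama is not installed.
MODEL="ollama://tinyllama"   # illustrative model reference

if command -v ramalama >/dev/null 2>&1; then
  # Pull the model into local, containerized storage
  ramalama pull "$MODEL"

  # Run the model in a container on your laptop (interactive chat)
  ramalama run "$MODEL"

  # Serve the model over a local REST endpoint instead
  ramalama serve "$MODEL"

  # Assumed flag: emit Kubernetes YAML for production deployment
  ramalama serve --generate kube "$MODEL"
fi
```

Because the model runs inside a container, the same image and invocation carry over from a laptop to edge devices or a Kubernetes cluster, which is the bridge between local development and production that the talk emphasizes.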
Syllabus
RamaLama: Running AI Models in Containers - DevConf.CZ 2025
Taught by
DevConf