
RamaLama - Running AI Models in Containers

DevConf via YouTube

Overview

Learn how to run AI models locally and in production using RamaLama, a tool that simplifies AI deployment through container technology. In this 35-minute conference talk, Daniel Walsh demonstrates how combining AI and container technologies makes running AI models straightforward and "boring." The talk walks through getting AI models running in containers on your laptop, then scaling them to production environments, including edge devices and Kubernetes clusters. It shows how RamaLama bridges the gap between local AI development and production deployment, giving developers an efficient workflow for managing containerized AI models across different computing environments.
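The laptop-to-production workflow described above can be sketched with RamaLama's CLI. The subcommand names (pull, run, serve) come from the RamaLama tool itself; the model reference and port are illustrative assumptions. The sketch prints each command rather than executing it, so it runs even without RamaLama installed:

```shell
#!/bin/sh
# Dry-run sketch of a RamaLama workflow (model name and flags are
# illustrative assumptions, not taken from the talk).
set -e

MODEL="ollama://smollm:135m"   # hypothetical small model for local testing

# Print each step instead of executing it, so the sketch is runnable anywhere.
run() { echo "+ $*"; }

run ramalama pull "$MODEL"                # fetch the model into local storage
run ramalama run "$MODEL"                 # chat with it in a container on your laptop
run ramalama serve --port 8080 "$MODEL"   # expose a REST endpoint for other apps
```

In an environment with RamaLama installed, dropping the `run` wrapper executes the same steps for real; RamaLama then picks an available container engine such as Podman or Docker.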

Syllabus

RamaLama: Running AI Models in Containers - DevConf.CZ 2025

Taught by

DevConf

