Overview
Explore Docker Compose's latest features designed for AI and cloud development workflows in this comprehensive 51-minute conference talk from Devoxx. Discover how recent updates to Docker Compose, including the new models attribute and provider services, enable declarative definition and execution of LLM-powered applications through live demonstrations with minimal slides.

Learn to integrate AI models into applications using the models section, connect to remote services via Telepresence provider services, and leverage Docker Offload to run computationally intensive workloads remotely when local resources are insufficient. Watch practical examples of defining LLMs like llama3 and qwen in Compose files, accessing these models from service containers, utilizing provider services for external resource preparation and connection, and executing GPU-heavy tasks remotely while maintaining your existing development workflow. Understand how to use Compose as a unified orchestration tool for managing everything from code to context in modern multi-service AI applications and agent-based systems.
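To give a sense of what the talk demonstrates, a minimal Compose file combining the top-level `models` element with a provider service might look like the sketch below. The model references, environment-variable names, and the provider's `type` and `options` values are illustrative assumptions, not taken from the talk itself:

```yaml
# Sketch of a Compose file for an LLM-backed app.
# Model references (ai/llama3, ai/qwen3) and provider options are assumptions.
models:
  llama:
    model: ai/llama3        # model pulled and served by the model runner
  qwen:
    model: ai/qwen3

services:
  app:
    build: .
    models:                 # long syntax: choose the env var names yourself
      llama:
        endpoint_var: LLAMA_URL     # endpoint URL injected into the container
        model_var: LLAMA_MODEL      # model identifier injected into the container
      qwen:
        endpoint_var: QWEN_URL
        model_var: QWEN_MODEL

  remote-env:
    provider:               # provider service: provisioning is delegated to a plugin
      type: telepresence    # assumed plugin name for the Telepresence integration
      options:
        namespace: dev      # illustrative option
```

In the short syntax a service can instead list model names directly (`models: [llama, qwen]`), in which case Compose derives the injected environment variable names from the model names.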
Syllabus
Docker Compose your Dev Toolkit for AI and Cloud workflows by Guillaume Lours
Taught by
Devoxx