Overview
Learn to deploy Stable Diffusion and other open-source generative AI models on OpenStack infrastructure in this technical conference talk. Discover how OpenStack's modular, open design lets organizations run powerful AI models on existing infrastructure while retaining data sovereignty, cost control, and flexibility. Explore practical deployment strategies on GPU-enabled OpenStack clouds using containerized services and subnet-routable floating IPs. See demonstrations of building secure, scalable, self-hosted GenAI services for internal applications and research. Understand how the deployed models can support advanced use cases, including local embeddings for RAG pipelines, semantic search, agent-based automation, and LoRA-based fine-tuning for domain-specific performance.
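To make the semantic-search use case concrete, here is a minimal sketch of ranking documents by cosine similarity against a query embedding. It assumes embeddings would come from a self-hosted model served on OpenStack; the document names, the three-dimensional toy vectors, and the `semantic_search` helper are illustrative stand-ins, not material from the talk.

```python
import math

# Toy embeddings standing in for vectors produced by a self-hosted
# embedding model (illustrative values, not real model output).
DOC_EMBEDDINGS = {
    "gpu flavor setup": [0.9, 0.1, 0.2],
    "floating ip routing": [0.1, 0.8, 0.3],
    "lora fine-tuning": [0.2, 0.3, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def semantic_search(query_vec, top_k=1):
    """Return the top_k document keys ranked by similarity to the query."""
    ranked = sorted(
        DOC_EMBEDDINGS.items(),
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [doc for doc, _ in ranked[:top_k]]

# A query vector close to the "gpu flavor setup" embedding retrieves it first.
print(semantic_search([0.85, 0.15, 0.25]))
```

In a real RAG pipeline the retrieved documents would then be passed as context to a generation model hosted on the same infrastructure.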
Syllabus
Open-Source GenAI on OpenStack (Part Two: Stable Diffusion)
Taught by
OpenInfra Foundation