
Open Source GenAI on OpenStack - Part One LLMs

OpenInfra Foundation via YouTube

Overview

Learn to deploy open-source Generative AI workloads on OpenStack infrastructure in this conference talk featuring Julian Pistorius, Mike Lowe, and Martial Michel. Discover how OpenStack's modular, open design enables organizations to run powerful AI models using existing infrastructure while maintaining data sovereignty, cost control, and flexibility. Explore practical deployment strategies for open-source GenAI software, including Large Language Models (LLMs) and Stable Diffusion, on GPU-enabled OpenStack environments using containerized services and subnet-routable floating IPs. Examine real-world examples of building secure, scalable, self-hosted GenAI services for internal applications and research purposes. Understand how deployed models can support advanced use cases such as local embeddings for RAG (Retrieval-Augmented Generation) pipelines, semantic search capabilities, agent-based automation systems, and LoRA-based fine-tuning techniques for domain-specific performance optimization.
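One use case mentioned above is local embeddings for semantic search in a RAG pipeline: documents and queries are embedded as vectors by a locally hosted model, and retrieval ranks documents by vector similarity. A minimal sketch of the ranking step, using hand-made toy vectors in place of real model output (the vectors and document names below are illustrative assumptions, not from the talk):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" -- in a real pipeline these would come from a
# self-hosted embedding model running on the OpenStack cluster.
docs = {
    "gpu scheduling": [0.9, 0.1, 0.0],
    "object storage": [0.1, 0.8, 0.2],
    "networking":     [0.0, 0.2, 0.9],
}

def top_match(query_vec):
    """Return the document whose embedding is most similar to the query."""
    return max(docs, key=lambda name: cosine(query_vec, docs[name]))
```

A query vector close to the "gpu scheduling" embedding, such as `[0.85, 0.15, 0.05]`, would retrieve that document; in a full RAG pipeline the retrieved text is then passed to the LLM as context.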

Syllabus

Open Source GenAI on OpenStack Part One LLMs

Taught by

OpenInfra Foundation

