Self-Host a Local AI Platform - Ollama and Open WebUI

Christian Lempa via YouTube

Overview

Learn to build and deploy a complete self-hosted AI platform using Ollama and Open WebUI in a HomeLab environment in this 39-minute tutorial. Discover the hardware requirements for running local AI models, including GPU specifications and AMD ROCm compatibility considerations. Follow along as the setup covers installing Ollama on Linux inside an LXC container, configuring the basics, and integrating Open WebUI for an intuitive web interface with advanced features. Explore securing the platform with a Traefik reverse proxy and Authentik authentication, along with troubleshooting tips that help avoid common deployment pitfalls. Finally, learn the fundamentals of working with local AI models, enabling web search, and weighing the practical limitations and trust considerations of self-hosted AI.
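
As a taste of what the finished platform exposes, here is a minimal sketch of querying a local Ollama server from Python over its REST API. It assumes Ollama is running on its default port (11434) and that a model has already been pulled; "llama3" below is a placeholder for whatever model you use:

    # Minimal sketch: ask a locally hosted model a question via Ollama's REST API.
    # Assumes Ollama listens on its default port (11434) and "llama3" is pulled.
    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"

    payload = {
        "model": "llama3",   # swap in any model you have pulled locally
        "prompt": "Explain what a reverse proxy does in one sentence.",
        "stream": False,     # return a single JSON object instead of a token stream
    }

    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    print(response.json()["response"])

Open WebUI builds on this same HTTP API, which is one reason the tutorial puts the service behind a reverse proxy and authentication rather than exposing it directly.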

Syllabus

00:00 Introduction
02:38 Hardware Requirements
06:58 Software Planning
08:35 Problems with Proxmox…
10:55 Installing a new LXC Container
14:44 Install Ollama on Linux
17:41 Ollama basics
19:56 Install Open WebUI
27:37 Open WebUI basics
31:05 Using AI models
35:01 Web Searching
37:18 Why I still don’t trust AI

Taught by

Christian Lempa
