Overview
Explore how to build a fully local, LLM-powered "second brain" assistant that runs entirely on your laptop, using open source technologies. Learn to combine Ollama, LangChain, OpenWebUI, CrewAI, and Granite models to create an AI assistant that doesn't require sending sensitive data to external servers or relying on models with unknown data provenance. Discover the practical trade-offs between local quantized models and cloud-hosted solutions, including latency challenges on laptop hardware and the effectiveness limits of 7-8 billion parameter models. Examine how reasoning and multimodal capabilities enhance assistant functionality while maintaining complete data privacy and control. Gain insights into the open source LLM landscape and the technical complexities of building energy-efficient, locally hosted AI solutions that keep your most important data secure on your own hardware.
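As a rough illustration of the pattern the course describes, the sketch below stuffs personal notes into a prompt and sends it to a local Ollama server via its REST API, so nothing leaves the laptop. The model tag, the example notes, and the helper names are assumptions for illustration; the endpoint shape follows Ollama's documented `/api/generate` API.

```python
# Minimal sketch of a local "second brain" query, assuming an Ollama server
# running at its default address and a Granite model already pulled locally.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default REST endpoint
MODEL = "granite3.1-dense:8b"  # hypothetical local model tag; substitute your own

def build_prompt(notes: list[str], question: str) -> str:
    """Place retrieved personal notes in the prompt so the model answers
    from your own data rather than from its training set."""
    context = "\n".join(f"- {n}" for n in notes)
    return (
        "Answer using only the notes below.\n"
        f"Notes:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

def ask_local_llm(prompt: str) -> str:
    """POST the prompt to the local Ollama server; no external calls are made."""
    body = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Build the prompt; calling ask_local_llm(prompt) requires a running Ollama server.
prompt = build_prompt(
    ["Dentist appointment on 2024-03-12"],
    "When is my dentist appointment?",
)
```

In a fuller setup, the notes list would come from a local retriever (e.g. a LangChain vector store) rather than being passed by hand, but the privacy property is the same: both retrieval and generation happen on your own hardware.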
Syllabus
Building Your (Local) LLM Second Brain - Olivia Buzek, IBM
Taught by
Linux Foundation