Overview
Explore how to build a fully local, LLM-powered "second brain" assistant using open source technologies that run entirely on your laptop. Learn to combine Ollama, LangChain, OpenWebUI, CrewAI, and Granite models to create an AI assistant that neither sends sensitive data to external servers nor relies on models with unknown data provenance. Discover the practical trade-offs between local quantized models and cloud-hosted solutions, including the latency challenges of running on laptop hardware and the effectiveness limits of 7-8 billion parameter models. Examine how reasoning and multimodal capabilities enhance assistant functionality while maintaining complete data privacy and control. Gain insight into the open source LLM landscape and the technical complexities of building energy-efficient, locally hosted AI solutions that keep your most important data secure on your own hardware.
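To make the "fully local" claim concrete, here is a minimal sketch of querying a locally hosted model through Ollama's REST API. It assumes Ollama is running at its default address (`http://localhost:11434`) and that a Granite model has already been pulled; the model tag `granite3.1-dense:8b` is illustrative, not something the talk prescribes. The prompt never leaves your machine.

```python
# Minimal sketch: call a locally hosted LLM via Ollama's /api/generate endpoint.
# Assumption: Ollama is running locally with a Granite model already pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming Ollama generate call."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the model's reply."""
    body = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama instance):
#   ask_local_llm("granite3.1-dense:8b", "Summarize my meeting notes.")
```

The same local endpoint is what higher-level tools like LangChain's Ollama integration and OpenWebUI talk to under the hood, which is why everything in the stack can stay on one laptop.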
Syllabus
Building Your (Local) LLM Second Brain - Olivia Buzek, IBM
Taught by
Linux Foundation