Overview
Learn to build NeuroMind, a local-first AI assistant with persistent memory, using Ollama, LangChain, and SQLite. This tutorial shows how to create a CLI application that runs entirely on your machine, featuring multiple personas, real-time thought streaming, and conversation storage in a SQLite database.

Explore the application architecture from the ground up: start with the project structure, then implement a memory layer backed by SQLite for persistent conversation storage. Build the domain logic that manages the assistant, a token streaming processor for real-time response handling, and a terminal user interface using the rich library. Complete the build with a REST API on FastAPI to enable external integrations.

Follow along with the provided source code and timestamps to build each component systematically, from the initial demo through the final API, gaining hands-on experience with modern AI development tools for creating privacy-focused, locally running assistants.
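The SQLite memory layer described above can be sketched as a small persistent store. The class name, table name, and schema below are illustrative assumptions, not the tutorial's actual code:

```python
import sqlite3


class ConversationMemory:
    """Minimal persistent conversation store backed by SQLite.

    A sketch of the memory layer; the tutorial's real schema may differ.
    """

    def __init__(self, db_path: str = "neuromind.db"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            """CREATE TABLE IF NOT EXISTS messages (
                   id INTEGER PRIMARY KEY AUTOINCREMENT,
                   role TEXT NOT NULL,        -- 'user' or 'assistant'
                   content TEXT NOT NULL,
                   created_at TEXT DEFAULT CURRENT_TIMESTAMP
               )"""
        )
        self.conn.commit()

    def add(self, role: str, content: str) -> None:
        """Append one message to the conversation log."""
        self.conn.execute(
            "INSERT INTO messages (role, content) VALUES (?, ?)",
            (role, content),
        )
        self.conn.commit()

    def history(self, limit: int = 20):
        """Return the most recent messages in chronological order."""
        rows = self.conn.execute(
            "SELECT role, content FROM messages ORDER BY id DESC LIMIT ?",
            (limit,),
        ).fetchall()
        return list(reversed(rows))
```

On each turn the stored history would be prepended to the prompt sent to the model, which is what makes the assistant's memory survive restarts.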
Syllabus
00:00 - NeuroMind demo
00:32 - App architecture
03:03 - Project structure
04:58 - Memory layer with SQLite
07:05 - Domain logic
09:56 - Token streaming processor
10:44 - Terminal UI with rich
11:21 - REST API with FastAPI
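The token streaming step above can be illustrated with a small stand-alone processor. This sketch makes two assumptions not confirmed by the course: chunks arrive one at a time from the model, and "thought" content is delimited by `<think>`/`</think>` markers (as some local models served by Ollama emit); the function name is hypothetical.

```python
from typing import Iterable, Iterator, Tuple


def split_thoughts(chunks: Iterable[str]) -> Iterator[Tuple[str, str]]:
    """Tag each streamed chunk as 'thought' or 'answer' text.

    Assumes the <think> / </think> markers arrive as whole chunks, which
    keeps the sketch simple; a production processor would also handle
    markers split across chunk boundaries.
    """
    in_think = False
    for chunk in chunks:
        if chunk == "<think>":
            in_think = True
        elif chunk == "</think>":
            in_think = False
        else:
            yield ("thought" if in_think else "answer", chunk)
```

A terminal UI built with rich could then render "thought" chunks dimmed and "answer" chunks normally as they arrive, giving the real-time thought streaming shown in the demo.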
Taught by
Venelin Valkov