LangChain Quickstart with Local LLM - Ollama, Pydantic Structured Output, Tool Use, MLflow Tracing

Venelin Valkov via YouTube

Class Central Classrooms

YouTube videos curated by Class Central.

Classroom Contents

  1. The "Fragile AI Script" Problem
  2. Local Setup with Ollama & uv
  3. LLM Abstraction: init_chat_model
  4. Prompt Templates as Functions
  5. Structured Output with Pydantic
  6. Implementing Tool Calling
  7. Bonus Tip
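
The local setup step can be sketched as follows. This is an assumption about the video's workflow, not a transcript of it: `llama3.2` is just an example model, and the project name is hypothetical.

```shell
# Install Ollama separately (https://ollama.com), then pull a local model
# (llama3.2 is only an example; any chat-capable model works):
ollama pull llama3.2

# Create a project and add the LangChain packages with uv:
uv init langchain-quickstart && cd langchain-quickstart
uv add langchain langchain-ollama
```

`uv` keeps the dependencies pinned in the project's lockfile, so the same environment can be recreated later.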
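
The "prompt templates as functions" idea can be illustrated without any framework: a template is just a function from variables to a list of messages. The function name and wording below are illustrative, not taken from the video.

```python
def review_prompt(movie: str, style: str = "concise") -> list[dict]:
    """A prompt template is just a function: variables in, messages out."""
    return [
        {"role": "system", "content": f"You are a {style} film critic."},
        {"role": "user", "content": f"Review the movie {movie}."},
    ]

# Calling the template produces the message list a chat model expects:
messages = review_prompt("Inception")
```

LangChain's `ChatPromptTemplate` wraps the same idea with validation of required variables.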
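
Structured output with Pydantic means defining a schema the model's reply must satisfy. A minimal sketch, assuming a schema like the one in the video (the `MovieReview` fields here are invented for illustration); the model call itself is shown commented out because it needs a running Ollama server.

```python
from pydantic import BaseModel, Field

class MovieReview(BaseModel):
    """Schema the LLM must fill in (field names are illustrative)."""
    title: str = Field(description="Movie title")
    sentiment: str = Field(description="positive, negative, or neutral")
    rating: float = Field(ge=0, le=10, description="Score out of 10")

# Bound to a chat model, the response comes back as a validated object:
# llm = init_chat_model("llama3.2", model_provider="ollama")
# structured_llm = llm.with_structured_output(MovieReview)
# review = structured_llm.invoke("Review the movie Inception")

# Pydantic validates the same way on plain data:
review = MovieReview(title="Inception", sentiment="positive", rating=8.8)
```

If the model returns a rating outside 0-10, Pydantic raises a `ValidationError` instead of silently passing bad data downstream.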
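
Tool calling reduces to three steps: describe functions to the model, let it emit a function name plus JSON arguments, then dispatch. A pure-Python sketch of the dispatch step, independent of LangChain's `@tool` decorator; `get_weather` and its canned reply are hypothetical.

```python
import json

def get_weather(city: str) -> str:
    """Example tool: return canned weather for a city (hypothetical)."""
    return f"Sunny in {city}"

# Registry mapping tool names (as advertised to the model) to functions.
TOOLS = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Run the tool the model asked for, with its JSON-encoded arguments."""
    fn = TOOLS[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return fn(**args)

# Shape of a model-emitted tool call (simplified):
call = {"name": "get_weather", "arguments": json.dumps({"city": "Sofia"})}
print(dispatch(call))  # Sunny in Sofia
```

In LangChain the registry and JSON plumbing are handled for you, but the control flow is the same: the tool's result is sent back to the model as a new message so it can compose a final answer.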
