The Easiest Ways to Run LLMs Locally - Docker Model Runner Tutorial

Tech With Tim via YouTube


Class Central Classrooms (beta)

YouTube videos curated by Class Central.

Classroom Contents


  1. 00:00 | Introducing Docker Model Runner
  2. 00:54 | System Requirements
  3. 02:19 | Setup/Install
  4. 03:50 | Using Models from Docker Desktop
  5. 04:12 | Using Models from Command Line
  6. 06:41 | How it Works
  7. 07:43 | Model Runner vs Ollama
  8. 09:11 | Simple Python Example
  9. 12:22 | Containerized Application Example
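The "Simple Python Example" chapter covers calling a locally running model from Python. A minimal sketch of that idea is below; the endpoint URL, port, and model tag are assumptions rather than details taken from the video (Docker Model Runner exposes an OpenAI-compatible API, reachable from the host once TCP access is enabled in Docker Desktop):

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint for Docker Model Runner with TCP
# host access enabled; adjust the port to match your setup.
MODEL_RUNNER_URL = "http://localhost:12434/engines/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str) -> str:
    """POST the request to the local Model Runner and return the reply text."""
    req = urllib.request.Request(
        MODEL_RUNNER_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # "ai/smollm2" is a placeholder model tag; substitute one you have pulled.
    print(ask("ai/smollm2", "Say hello in one sentence."))
```

Because the API is OpenAI-compatible, the official `openai` client library can also be pointed at the same base URL instead of hand-rolling the HTTP request.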
