
Run LLMs Locally With Docker Model Runner

Krish Naik via YouTube

Overview

Learn to run large language models locally using Docker Model Runner in this 19-minute tutorial, which addresses common challenges developers face when adding AI to modern applications. The video covers how to overcome typical pain points of local LLM deployment: selecting an appropriate model, handling hardware compatibility issues, and tuning performance. It also explores why running LLMs locally is useful for development environments, testing scenarios, and offline use cases, and shows how a Docker-based workflow removes much of the setup and configuration complexity traditionally involved. By the end, you will have practical knowledge for adding AI capabilities to your applications while keeping control of your development environment and reducing dependence on external services.
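The Docker-based workflow described above can be sketched with the `docker model` CLI that Docker Model Runner provides. This is a minimal illustration, not a transcript of the video; the model name `ai/smollm2` is an example tag from Docker Hub's `ai/` namespace, and exact commands and flags may vary by Docker Desktop version, so check `docker model --help` on your installation.

```shell
# Pull a model image from Docker Hub's ai/ namespace
# (model availability and tags may differ on your setup)
docker model pull ai/smollm2

# List models available locally
docker model list

# Run the model interactively with a one-off prompt
docker model run ai/smollm2 "Explain what a container is in one sentence."
```

Because Model Runner exposes an OpenAI-compatible API, the same model can also be queried programmatically from an application once it is running, which is what makes this setup convenient for local development and testing.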

Syllabus

Run LLMs Locally With Docker Model Runner

Taught by

Krish Naik
