Overview
Learn how to run DeepSeek's advanced reasoning models locally on your laptop in this 16-minute tutorial video. Get step-by-step guidance on downloading and installing LM Studio, navigating its interface, and selecting a model suited to your system's specifications. Discover how to test model performance, work with larger models, and understand document-interaction capabilities and limitations. Explore recommendations for different hardware configurations, from 8GB-RAM Macs to high-powered systems with 128GB of RAM; for older machines, find an alternative solution using llamafile on Intel-based Macs. Master practical tips for optimal model performance and learn about the available model options, including DeepSeek-R1-Distill variants of the Qwen and Llama models, with specific guidance for Windows and Mac users at different RAM capacities.
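The hardware recommendations above roughly scale model size with available RAM. As a hypothetical sketch (the tier thresholds and model suggestions here are illustrative, not the video's exact guidance), a quick check like this can point you at a sensible DeepSeek-R1 distill size before downloading anything:

```shell
# Illustrative RAM-to-model-size guide (thresholds are assumptions, not from the video).
# On macOS you can get RAM in bytes via: sysctl -n hw.memsize
# On Linux, see: /proc/meminfo
ram_gb=8   # set this to your machine's RAM in GB

if [ "$ram_gb" -ge 64 ]; then
  echo "Try a 32B+ distill (e.g. DeepSeek-R1-Distill-Qwen-32B)"
elif [ "$ram_gb" -ge 16 ]; then
  echo "Try a 7B-14B distill (e.g. DeepSeek-R1-Distill-Qwen-7B)"
else
  echo "Stick to 1.5B-8B distills (e.g. DeepSeek-R1-Distill-Llama-8B)"
fi
```

Quantized variants (Q4 and similar) reduce the memory footprint further, which is why an 8GB Mac can still run the smaller distills.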
Syllabus
00:11 Downloading and Installing LM Studio
00:41 Navigating LM Studio Interface
01:01 Choosing and Running DeepSeek Reasoning Models via Qwen and Llama Distills
02:49 Testing Model Performance
05:19 Using Larger Models
09:32 Document Interaction and Limitations
12:53 Mistral Small (Not a Reasoning Model)
13:45 Alternative Methods for Older Macs/PCs - llamafile
15:51 Conclusion and Final Tips
Taught by
Trelis Research