Running Large Language Models on AMD Strix Halo AI Ryzen MAX+ 395 - GLM 4.5-Air-106B and Qwen3-235B Tutorial

Donato Capitella via YouTube

Class Central Classrooms: YouTube videos curated by Class Central.

Classroom Contents

  1. 00:00 - Introduction to AMD "Strix Halo" Ryzen AI MAX 395
  2. 01:39 - TL;DR
  3. 04:39 - Running LLMs Locally
  4. 06:46 - AMD "Strix Halo" Mini PCs
  5. 09:36 - HP Z2 G1a Mini Workstation
  6. 11:59 - My Setup: Memory + Llama.cpp Builds
  7. 14:00 - Vulkan: AMDVLK/RADV/ROCm
  8. 15:33 - AMD ROCm
  9. 17:08 - Fedora-Based Toolboxes
  10. 17:32 - Benchmark Results
  11. 20:50 - Memory Requirements & Context Size
  12. 23:57 - Credits
  13. 24:58 - Conclusion
