Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
Deploy vLLM on AWS in Under 10 Minutes
1. 0:00 Why vLLM and why it’s so fast
2. 1:22 How vLLM optimizes memory & inference performance
3. 3:29 AWS service quota requirements for GPU instances
4. 4:18 Best AWS instance for getting started
5. 5:03 Ansible + collection prerequisites
6. 6:04 AWS CLI and credential setup
7. 7:11 Creating a Hugging Face access token
8. 7:58 Playbook 1 – aws_helper walkthrough
9. 9:56 Reviewing the generated vars file
10. 9:59 Playbook 2 – vllm_installer deployment
11. 10:40 Instance provisioning & dependency installation
12. 11:45 vLLM server is live
13. 12:03 Testing with curl
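The final chapter tests the deployed server with curl. vLLM serves an OpenAI-compatible API (by default on port 8000), so a smoke test is a single POST to the `/v1/completions` endpoint. A minimal sketch follows; the public IP placeholder and the model name are assumptions, substitute whatever host and model the playbook actually deployed:

```shell
# Smoke-test the vLLM OpenAI-compatible API.
# EC2_PUBLIC_IP is a placeholder for your instance's address;
# the model name below is an example, use the one the server was started with.
curl http://EC2_PUBLIC_IP:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.1-8B-Instruct",
        "prompt": "Say hello from vLLM.",
        "max_tokens": 32
      }'
```

A successful response is a JSON object containing a `choices` array with the generated text; a connection error usually means the security group does not allow inbound traffic on port 8000 or the server is still downloading model weights.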