YouTube videos curated by Class Central.
Classroom Contents
Fast and Flexible Inference on Open-Source AI Models at Scale - BRK117
- 1 00:00:00 - Use cases: hybrid model architecture, LLM agents, data boundary control
- 2 00:09:09 - Introduction to GPU-intensive workloads like physics and video processing
- 3 00:11:47 - Docker Compose for AI agents and simplified cloud deployment
- 4 00:16:00 - Live testing of the dashboard generator and log streaming visualization
- 5 00:20:31 - AKS investment areas: scale, security, cost optimization, and AI support
- 6 00:25:04 - Enhanced workload scheduling and configuration for AI workloads
- 7 00:30:25 - Inference traffic management using Gateway API and Ignite demo preview
- 8 00:35:11 - RBC's CI/CD pipeline accelerating secure GPU resource provisioning
- 9 00:38:01 - RBC strategy: building Canada's largest AI farm within compliance boundaries