
Spotify's Approach to Distributed LLM Training with Ray on GKE

Anyscale via YouTube

Overview

Explore Spotify's approach to distributed Large Language Model (LLM) training in this Ray Summit 2024 breakout session. Learn how Spotify meets Generative AI demands by building an ML platform with Ray on Google Kubernetes Engine (GKE), including support for training models exceeding 70B parameters, management of diverse machine types such as NVIDIA H100 GPUs, and Kubernetes-based resource allocation. The session also covers performance optimizations such as compact placement and NCCL Fast Socket, and shows how Ray is used to distribute training applications across GKE-managed resources. It offers practical guidance for organizations looking to build or improve their LLM training capabilities with Ray and Kubernetes in the cloud.
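The general pattern the session describes, Ray worker groups scheduled onto GKE GPU node pools, can be sketched with a KubeRay RayCluster manifest. This is a minimal illustration, not Spotify's actual configuration: the cluster name, image tag, replica counts, and node-selector value are assumptions for the example.

```yaml
# Hypothetical KubeRay RayCluster fragment: a worker group pinned to
# H100 GPU nodes on GKE. All names and counts are illustrative.
apiVersion: ray.io/v1
kind: RayCluster
metadata:
  name: llm-train            # assumed name
spec:
  headGroupSpec:
    rayStartParams: {}
    template:
      spec:
        containers:
          - name: ray-head
            image: rayproject/ray:2.9.0-gpu   # assumed image/tag
  workerGroupSpecs:
    - groupName: h100-workers
      replicas: 4            # assumed worker count
      rayStartParams: {}
      template:
        spec:
          nodeSelector:
            # GKE label selecting nodes with H100 accelerators
            cloud.google.com/gke-accelerator: nvidia-h100-80gb
          containers:
            - name: ray-worker
              image: rayproject/ray:2.9.0-gpu
              resources:
                limits:
                  nvidia.com/gpu: 8   # GKE A3 VMs expose 8 H100s per node
```

Note that the optimizations mentioned in the talk are configured at the GKE node-pool level rather than in the Ray manifest: compact placement and NCCL Fast Socket correspond to node-pool options (for example, `--placement-type=COMPACT` and `--enable-fast-socket` with `--enable-gvnic` on `gcloud container node-pools create`), so they apply to whichever Ray workers land on that pool.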

Syllabus

Spotify's Approach to Distributed LLM Training with Ray on GKE | Ray Summit 2024

Taught by

Anyscale

