Heterogeneous Hybrid Distributed Training for Large-Scale Language Models

OpenInfra Foundation via YouTube

Overview

Learn about the technical challenges and solutions in heterogeneous distributed training for Large Language Models (LLMs) in this 11-minute conference talk. Explore how combining computing resources from different vendors and GPU architectures for distributed parallel acceleration can support the development of LLMs with hundreds of billions of parameters. Discover the research conducted by China Mobile and industry partners to overcome challenges arising from GPU architecture differences, memory constraints, and cross-vendor hardware incompatibilities. Gain insights into the core functional components of a training system designed to let heterogeneous GPUs work together effectively, contributing to the advancement of the intelligent computing ecosystem.
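The training system described in the talk is not public, but one recurring idea in heterogeneous training is balancing work across GPUs of unequal speed. The minimal Python sketch below illustrates that idea only: it assigns contiguous blocks of model layers to devices in proportion to their relative throughput, as in pipeline-parallel stage sizing. The function name partition_layers, the device names, and the throughput figures are all hypothetical, not components of the system China Mobile describes.

```python
# Sketch: sizing pipeline-parallel stages across heterogeneous GPUs.
# Everything here (device names, throughput numbers) is a hypothetical
# illustration; it is not the talk's actual training system.

def partition_layers(num_layers, device_throughputs):
    """Assign each device a contiguous block of layers sized in
    proportion to its relative throughput, so faster GPUs get more work."""
    total = sum(device_throughputs.values())
    assignment = {}
    start = 0
    devices = list(device_throughputs.items())
    for i, (device, tput) in enumerate(devices):
        if i == len(devices) - 1:
            # Give the remainder to the last device to cover rounding.
            count = num_layers - start
        else:
            count = round(num_layers * tput / total)
        assignment[device] = range(start, start + count)
        start += count
    return assignment

if __name__ == "__main__":
    # Hypothetical mixed cluster: one faster vendor-A GPU, two slower vendor-B GPUs.
    throughputs = {"vendorA:0": 2.0, "vendorB:0": 1.0, "vendorB:1": 1.0}
    for device, layers in partition_layers(48, throughputs).items():
        print(f"{device} -> layers {layers.start}..{layers.stop - 1}")
```

Sizing stages by relative throughput is one simple way to keep a faster GPU from idling while slower devices finish their share; a real system would also have to account for per-device memory capacity and interconnect differences, two of the constraints the talk highlights.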

Syllabus

Heterogeneous Hybrid Distributed Training Helps the Development of Large-Scale Language Model

Taught by

OpenInfra Foundation

