
Federated Open Language Models: Training LMs on Distributed Data

Simons Institute via YouTube

Overview

This talk by Sewon Min of UC Berkeley explores Federated Open Language Models, focusing on techniques for training language models on distributed data. It covers the challenges of, and solutions for, developing LMs when data is spread across multiple locations rather than centralized — an approach with important implications for privacy-preserving and resource-efficient AI development. Part of "The Future of Language Models and Transformers" series at the Simons Institute, the presentation offers insight into emerging methodologies that could shape how large language models are trained and deployed in distributed computing environments.
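For readers unfamiliar with the general idea of training on distributed data, a minimal sketch of federated averaging (FedAvg) may help. Note this is an illustrative assumption about the family of techniques involved, not necessarily the specific method presented in the talk; the toy linear model and all function names below are hypothetical.

```python
def local_update(weights, data, lr=0.1):
    """One gradient-descent step on a client's local data.
    Toy model: linear regression y = w * x with squared loss."""
    w = weights[0]
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return [w - lr * grad]

def fed_avg(global_weights, client_datasets, rounds=50):
    """Each round: every client trains on its own data, then the server
    averages the updated weights, weighted by local dataset size.
    Raw data never leaves the client; only weights are shared."""
    for _ in range(rounds):
        updates, sizes = [], []
        for data in client_datasets:
            updates.append(local_update(global_weights, data))
            sizes.append(len(data))
        total = sum(sizes)
        global_weights = [
            sum(u[i] * n / total for u, n in zip(updates, sizes))
            for i in range(len(global_weights))
        ]
    return global_weights

# Two clients whose data follows y = 3x; the global weight converges to 3
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = fed_avg([0.0], clients)
```

The key property illustrated here is that each client's raw examples stay local; only model parameters are exchanged and aggregated, which is what makes the approach attractive when data cannot be centralized.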

Syllabus

Federated Open Language Models: Training LMs on Distributed Data

Taught by

Simons Institute
