
Scaling Up JAX and Flax NNX Models with Distributed Computing - Part 2

Google via YouTube

Overview

Learn how to scale Flax NNX models using JAX's distributed computing capabilities and SPMD paradigm in this 11-minute tutorial from Google. Discover JAX's approach to parallelism and how it integrates with Flax NNX, focusing on the main workflow for applying JAX's sharding primitives to NNX models. Master the sharded initialization pattern, which is essential for scaling modern models that outgrow a single accelerator, making this especially valuable for developers transitioning from PyTorch to JAX and Flax NNX.
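To make the sharding primitives mentioned above concrete, here is a minimal pure-JAX sketch of the SPMD workflow: build a device mesh, place arrays with `NamedSharding`, and let `jax.jit` compile a program that runs across all devices. This is an illustrative sketch, not the video's code; the mesh axis name `'data'` and the toy shapes are assumptions for the example, and Flax NNX layers its sharded-initialization pattern on top of these same primitives.

```python
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Build a 1-D device mesh over whatever devices are available
# (accelerators in production, CPU devices in a local sketch).
mesh = Mesh(jax.devices(), axis_names=('data',))

# Shard the batch along the 'data' mesh axis; replicate the weights.
batch = jax.device_put(jnp.ones((8, 4)), NamedSharding(mesh, P('data', None)))
w = jax.device_put(jnp.ones((4, 2)), NamedSharding(mesh, P(None, None)))

@jax.jit
def forward(x, w):
    # One program, compiled once; XLA partitions it across the mesh (SPMD).
    return x @ w

out = forward(batch, w)
```

With Flax NNX, the same idea is applied at model-initialization time: parameters are created directly with their target shardings inside a jitted init function, so large models never need to materialize on a single device.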

Syllabus

Scaling Up (Part 2)

Taught by

Google Developers
