Overview
Explore the evolution and future of PyTorch as the foundational framework for AI development in this 30-minute conference talk. Discover how PyTorch has transformed over the past two years alongside the remarkable changes in the AI landscape, from the rise of Large Language Models powering applications like ChatGPT to the open revolution led by models like Llama and the emergence of agentic systems.

Learn about PyTorch's role as a central hub that brings together diverse perspectives within a community collectively building a comprehensive framework integrating all layers of AI development. Examine the current PyTorch ecosystem and its widespread adoption in AI research, while understanding the major challenges facing the transition into a generative AI and agent-first world.

Gain insights into PyTorch's broader vision and upcoming developments, including planet-scale training and inference capabilities, the complexities of innovating Single Program Multiple Data (SPMD) approaches, and the intriguing question of whether AI can write the foundations of AI itself. Understand why Large Language Models struggle with writing kernels, and discover the strategic direction for PyTorch Foundation 2.0 as it continues evolving as the open language of artificial intelligence.
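For context on the SPMD approach mentioned above: in the Single Program Multiple Data model, every worker ("rank") runs the same program on its own shard of the data, and results are combined collectively. The sketch below is illustrative only, not from the talk; in a real SPMD runtime such as torch.distributed each rank would be a separate process, whereas here all ranks are simulated in one loop.

```python
# Minimal illustration of the SPMD idea: one program, many ranks,
# each operating on its own shard of the data.
def spmd_program(rank, world_size, data):
    # Each rank sees only its shard (data parallelism in miniature).
    shard = data[rank::world_size]
    return sum(x * x for x in shard)  # local computation on the shard

data = list(range(8))
world_size = 4

# Simulate the ranks sequentially; a real runtime launches them in parallel.
partials = [spmd_program(r, world_size, data) for r in range(world_size)]
total = sum(partials)  # stands in for a collective all-reduce
print(total)  # matches a single-program sum of squares over all the data
```

The point of the model, and part of why it is hard to innovate on, is that parallelism is implicit in the program structure: the same code runs everywhere, and only the rank and the collective operations distinguish one worker from another.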
Syllabus
00:00 - Introduction
01:50 - An abridged history of PyTorch
05:48 - The PyTorch ecosystem in a nutshell
08:28 - PyTorch adoption in AI research
10:12 - Major challenges in the AI space
12:40 - A broader vision for PyTorch
14:30 - What’s next for PyTorch: Planet-scale training and inference
17:10 - The difficulty of innovating SPMD
22:05 - What’s next for PyTorch: Can AI write the foundations of AI?
25:10 - Why are LLMs so bad at writing kernels?
28:49 - Closing thoughts and pointers
Taught by
Weights & Biases