Overview
Explore the evolution and future of deep learning hardware in this comprehensive lecture by NVIDIA Chief Scientist Dr. Bill Dally. Discover how the current AI revolution, including generative AI systems such as ChatGPT, has been enabled by powerful computing hardware, particularly the GPUs that finally made deep learning algorithms from the 1980s practical.

Learn about the exponential growth in demand for training operations, which has increased 10-million-fold over the past decade and is currently growing 16× per year, and understand how hardware performance now gates advances in deep learning. Examine the specific challenges posed by the autoregressive nature of large language model inference, which places heavy demands on latency and memory bandwidth, and explore how the trend toward agentic AI systems imposes additional latency requirements.

Gain insights from Dr. Dally's extensive experience in computer architecture, including his work on innovative systems such as the J-Machine, M-Machine, Imagine processor, and Merrimac streaming supercomputer, which pioneered concepts like stream processing and GPU computing. Understand how deep learning hardware is adapting to meet the latest computational challenges in artificial intelligence and which trends are shaping the future of AI acceleration.
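To see why autoregressive inference stresses latency and memory bandwidth, consider that each new token depends on all previous tokens, so generation is inherently sequential. The sketch below is illustrative only: `toy_next_token` is a hypothetical stand-in for a model's forward pass, not any real inference API.

```python
# Minimal sketch of autoregressive decoding. toy_next_token is a toy
# placeholder for a transformer forward pass; in a real model, every
# step must stream the full set of weights from memory, so per-token
# latency is bounded by memory bandwidth rather than raw FLOPs.

def toy_next_token(tokens: list[int]) -> int:
    # Hypothetical next-token function for illustration.
    return (sum(tokens) * 31 + 7) % 1000

def generate(prompt: list[int], n_new: int) -> list[int]:
    tokens = list(prompt)
    for _ in range(n_new):
        # Each token depends on all tokens produced so far, so these
        # n_new steps cannot be parallelized across the sequence.
        tokens.append(toy_next_token(tokens))
    return tokens

print(len(generate([1, 2, 3], 5)))  # 8 tokens: 3 prompt + 5 generated
```

Because the loop cannot be parallelized over output positions, generating N tokens costs N sequential forward passes, which is why inference hardware emphasizes low latency and high memory bandwidth rather than only peak throughput.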
Syllabus
Bill Dally - Trends in Deep Learning Hardware
Taught by
UC Berkeley EECS