Overview
Dive deeper into neural network pruning and sparsity in this lecture from MIT's course on TinyML and Efficient Deep Learning Computing. Explore advanced pruning techniques, including how to select optimal pruning ratios for each layer and fine-tune sparse neural networks. Discover the lottery ticket hypothesis and learn about system support for sparsity. Gain valuable insights into making deep learning models more efficient and deployable on resource-constrained devices. Access accompanying slides and additional course materials to enhance your understanding of pruning, sensitivity scans, automatic pruning, and the AMC algorithm.
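The pruning the lecture builds on typically removes the smallest-magnitude weights up to a chosen per-layer pruning ratio. As a rough illustration only (the function name and list-based representation are our own, not the course's code), a minimal sketch of magnitude-based pruning at a given ratio might look like:

```python
def magnitude_prune(weights, ratio):
    """Zero out the smallest-magnitude fraction `ratio` of the weights.

    Illustrative sketch of magnitude-based pruning on a flat list of
    weights; real frameworks apply a mask per tensor instead.
    """
    k = int(len(weights) * ratio)  # number of weights to remove
    if k == 0:
        return list(weights)
    # Threshold = k-th smallest absolute value; prune everything at or below it
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

layer = [0.1, -0.9, 0.4, -0.05]
print(magnitude_prune(layer, 0.5))  # → [0.0, -0.9, 0.4, 0.0]
```

Choosing `ratio` separately for each layer, guided by a sensitivity scan, and then fine-tuning the remaining weights is exactly the per-layer selection problem the lecture addresses.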
Syllabus
Lecture 04 - Pruning and Sparsity (Part II) | MIT 6.S965
Taught by
MIT HAN Lab