
Sparsest ReLU Neural Networks

Paul G. Allen School via YouTube

Overview

Explore the mathematical foundations of sparse ReLU neural networks in this 14-minute conference talk from the IFDS Workshop, presented by Julia Blair Nakhleh of the University of Wisconsin-Madison. Learn the theoretical principles behind constructing the sparsest possible ReLU (Rectified Linear Unit) neural networks, and see how mathematical and computational techniques can minimize network complexity while maintaining performance. Gain insight into the intersection of optimization theory and neural network design, with particular focus on how the structure of ReLU activation functions can be leveraged to achieve maximal sparsity.
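As a rough, illustrative sketch only (this is a common heuristic, not the theoretical method presented in the talk), one simple way to induce sparsity in a ReLU network is magnitude pruning: zeroing out weights whose absolute value falls below a threshold, then counting the nonzero weights that remain.

```python
# Illustrative magnitude pruning of a tiny one-hidden-layer ReLU network.
# All weights and the threshold below are made-up example values.

def relu(xs):
    # Rectified Linear Unit applied elementwise.
    return [max(0.0, v) for v in xs]

def forward(x, W1, b1, W2, b2):
    # hidden = ReLU(W1 @ x + b1); output = W2 @ hidden + b2
    h = relu([sum(w * xi for w, xi in zip(row, x)) + b
              for row, b in zip(W1, b1)])
    return [sum(w * hi for w, hi in zip(row, h)) + b
            for row, b in zip(W2, b2)]

def prune(W, threshold):
    # Zero out weights with magnitude below the threshold.
    return [[w if abs(w) >= threshold else 0.0 for w in row] for row in W]

# A 2-input, 2-hidden-unit, 1-output network with two near-zero weights.
W1 = [[0.9, -0.02], [0.01, 1.1]]
b1 = [0.0, 0.0]
W2 = [[1.0, -0.03]]
b2 = [0.0]

W1_sparse = prune(W1, 0.05)
W2_sparse = prune(W2, 0.05)
nonzero = sum(1 for row in W1_sparse + W2_sparse for w in row if w != 0.0)
print(nonzero)  # 3 of the 6 weights survive pruning
```

Pruning is only one of many sparsification strategies; the talk concerns the stronger theoretical question of the sparsest ReLU network realizing a given function, which pruning a trained network does not answer.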

Syllabus

IFDS Workshop Short Talks–Sparsest ReLU Neural Networks

Taught by

Paul G. Allen School

