Mathematics for Deep Neural Networks: Statistical Theory for Deep ReLU Networks - Lecture 4
Georgia Tech Research via YouTube
Overview
Explore the mathematical foundations of deep neural networks in this lecture from the TRIAD Distinguished Lecture Series, the fourth installment of Johannes Schmidt-Hieber's five-part series on the statistical theory of deep ReLU networks. Examine the specific properties of the ReLU activation function, focusing on its relationship to skip connections and its capacity for efficient polynomial approximation. Gain insight into how risk bounds can be derived for sparsely connected networks, and deepen your understanding of the theory behind recent advances in bounding the risk of deep ReLU network estimators.
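The "efficient polynomial approximation" mentioned above presumably refers to the standard tent-function construction (due to Yarotsky, and used in Schmidt-Hieber's analysis), in which a ReLU network of depth proportional to m approximates x² on [0, 1] with uniform error at most 2^(-2m-2). As a minimal numerical sketch of that construction (not taken from the lecture slides themselves):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# ReLU can also realize the identity map via x = relu(x) - relu(-x),
# which is how plain ReLU layers emulate skip connections.

def hat(x):
    # Tent function on [0, 1] built from three ReLU units:
    # hat(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, m):
    # Depth-m ReLU construction: x**2 = x - sum_{k=1}^m hat^(k)(x) / 4**k
    # on [0, 1], with uniform error at most 2**(-2m - 2), so each extra
    # layer shrinks the error by a constant factor.
    g = np.asarray(x, dtype=float)
    out = g.copy()
    for k in range(1, m + 1):
        g = hat(g)          # k-fold composition of the tent function
        out -= g / 4**k
    return out

x = np.linspace(0.0, 1.0, 1001)
for m in (2, 4, 8):
    err = np.abs(square_approx(x, m) - x**2).max()
    print(f"m={m}: max error {err:.2e}  (bound {2.0 ** (-2 * m - 2):.2e})")
```

From x², products xy = ((x + y)² - x² - y²)/2 and hence general polynomials follow, which is the building block behind the approximation and risk-bound results the lecture covers.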
Syllabus
TRIAD Distinguished Lecture Series | Johannes Schmidt-Hieber Lecture 4 (of 5)
Taught by
Georgia Tech Research