Overview
Explore advanced theoretical foundations of deep learning through this graduate-level lecture series taught by Eli Grigsby from Boston College at Harvard's Center of Mathematical Sciences and Applications. Delve into sophisticated mathematical concepts underlying modern deep learning architectures and algorithms across eleven comprehensive sessions. Examine rigorous theoretical frameworks that govern neural network behavior, optimization landscapes, generalization bounds, and convergence properties. Investigate cutting-edge research topics including approximation theory for deep networks, statistical learning theory applications, and mathematical analysis of gradient-based optimization methods. Study the interplay between network architecture, expressivity, and learnability through formal mathematical lenses. Analyze theoretical guarantees for deep learning performance and explore connections between deep learning and other mathematical disciplines such as functional analysis, probability theory, and differential geometry. Engage with current research frontiers in understanding why and how deep networks work from a theoretical perspective, preparing you for advanced research in machine learning theory and related mathematical fields.
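To make the optimization topic concrete, here is a minimal sketch (not taken from the lectures) of the plain gradient-descent iteration that such convergence analyses study, applied to a simple one-dimensional quadratic whose minimizer is known in closed form:

```python
def grad_descent(grad, w0, lr=0.1, steps=100):
    """Iterate w <- w - lr * grad(w) for a fixed number of steps."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# Toy objective: f(w) = (w - 3)^2, so grad f(w) = 2(w - 3).
# The unique minimizer is w = 3; gradient descent contracts the error
# by a factor of (1 - 2*lr) = 0.8 per step, so it converges linearly.
w_star = grad_descent(lambda w: 2 * (w - 3.0), w0=0.0)
```

For this convex quadratic the iterates converge geometrically to the minimizer; the theoretical question the lectures address is when and why similar guarantees survive in the non-convex landscapes of deep networks.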
Syllabus
Deep Learning 9/10/2024
Deep Learning 9/12/2024
Deep Learning 9/17/2024
Deep Learning 9/19/2024
Deep Learning 9/24/2024
Deep Learning 9/26/2024
Deep Learning 10/1/2024
Deep Learning 10/8/2024
Deep Learning 10/11/2024
Deep Learning 10/15/2024
Deep Learning 10/22/2024
Taught by
Harvard CMSA