YouTube

Theory of Deep Learning - Where Next?

Institute for Advanced Study via YouTube

Overview

Explore fundamental theoretical questions in deep learning through this workshop featuring leading researchers examining optimization landscapes, generalization bounds, and the representational power of neural networks. Delve into current research on whether optimization is the right language for understanding deep learning, investigate emergent linguistic structure in contextual word representations, and analyze the landscape connectivity of low-cost solutions in multilayer networks.

Examine GAN optimization through competitive gradient descent, information-theoretic generalization bounds, and PAC-Bayesian approaches to understanding generalization. Learn about overcoming the curse of dimensionality and mode collapse, energy-based approaches to representation learning, and large deviation principles for large neural networks. Investigate a modern perspective on the connection between neural networks and kernels, explore lessons from classical statistics applied to modern machine learning, and understand the inductive bias introduced by dropout.

Discover insights into interpreting deep neural networks, kernel and rich regimes in deep learning, and provably efficient reinforcement learning with linear function approximation. Analyze the statistical mechanics of deep learning, the representational power of graph neural networks, and causal analysis of generalization, and consider the design of explicit regularizers for deep models, through presentations from renowned experts including Sanjeev Arora, Yann LeCun, and other leading theorists in the field.

Syllabus

Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora
Emergent linguistic structure in deep contextual neural word representations - Chris Manning
Explaining Landscape Connectivity of Low-cost Solutions for Multilayer Nets - Rong Ge
Fixing GAN optimization through competitive gradient descent - Anima Anandkumar
Tightening information-theoretic generalization bounds with data-dependent estimate... - Daniel Roy
Spotlight Talks - Amir Asadi, Dimitris Kalimeris
PAC-Bayesian approaches to understanding generalization in deep learning - Gintare Dziugaite
Overcoming the Curse of Dimensionality and Mode Collapse - Ke Li
Are All Features Created Equal? - Aleksander Madry
Energy-based Approaches to Representation Learning - Yann LeCun
On Large Deviation Principles for Large Neural Networks - Joan Bruna
Neural Models for Speech and Language: Successes, Challenges, and the... - Michael Collins
Spotlight Talks - Various
On the Connection between Neural Networks and Kernels: a Modern Perspective - Simon Du
From Classical Statistics to Modern ML: the Lessons of Deep Learning - Mikhail Belkin
Spotlight Talks - Various
Towards a theoretical foundation of neural networks - Jason Lee
Panel Session - Various
Learning Representations Using Causal Invariance - Leon Bottou
Understanding the inductive bias due to dropout - Raman Arora
Interpreting Deep Neural Networks - Bin Yu
Kernel and Rich Regimes in Deep Learning - Nati Srebro
Spotlight Talks Pt1 - Jiaoyang Huang, Arjun Nitin Bhagoji, Rosemary Ke
Spotlight Talks Pt2 - Sebastian Goldt, Akshay Rangamani, Omar Shehab, Or Sharir
Provably Efficient Reinforcement Learning with Linear Function Approximation - Chi Jin
Reinforcement Learning, Deep Learning, and the Role of Policy Gradient Methods - Sham Kakade
Statistical mechanics of deep learning - Surya Ganguli
Representational Power of Graph Neural Networks - Stefanie Jegelka
Spotlight Talks Pt1 - Zhiyuan Li, John Zarka, Stanislav Fort
Toward a Causal Analysis of Generalization in Deep Learning - Behnam Neyshabur
Spotlight Talks Pt2 - Zhifeng Kong, Daniel Paul Kunin, Omar Montasser
Designing explicit regularizers for deep models? - Tengyu Ma

Taught by

Institute for Advanced Study
