NYU Deep Learning – Spring 2020
Syllabus
Week 1 – Lecture: History, motivation, and evolution of Deep Learning
Week 1 – Practicum: Classification, linear algebra, and visualisation
Week 2 – Lecture: Stochastic gradient descent and backpropagation
Week 2 – Practicum: Training a neural network
Week 3 – Lecture: Convolutional neural networks
Week 3 – Practicum: Properties of natural signals and CNNs
Week 4 – Practicum: Listening to convolutions
Week 5 – Lecture: Optimisation
Week 5 – Practicum: 1D multi-channel convolution and autograd
Week 6 – Lecture: CNN applications, RNN, and attention
Week 6 – Practicum: RNN and LSTM architectures
Week 7 – Lecture: Energy-based models and self-supervised learning
Week 7 – Practicum: Under- and over-complete autoencoders
Week 8 – Lecture: Contrastive methods and regularised latent variable models
Week 8 – Practicum: Variational autoencoders
Week 9 – Lecture: Group sparsity, world models, and generative adversarial networks (GANs)
Week 9 – Practicum: (Energy-based) Generative adversarial networks
Week 10 – Lecture: Self-supervised learning (SSL) in computer vision (CV)
Week 10 – Practicum: The Truck Backer-Upper
Week 11 – Lecture: PyTorch activation and loss functions
Week 11 – Practicum: Prediction and Policy learning Under Uncertainty (PPUU)
Week 12 – Lecture: Deep Learning for Natural Language Processing (NLP)
Week 12 – Practicum: Attention and the Transformer
Week 13 – Lecture: Graph Convolutional Networks (GCNs)
Week 13 – Practicum: Graph Convolutional Networks (GCNs)
Week 14 – Lecture: Structured prediction with energy-based models
Week 14 – Practicum: Overfitting, regularisation, and Bayesian neural nets
Week 15 – Practicum part A: Inference for latent variable energy-based models (EBMs)
Week 15 – Practicum part B: Training latent variable energy-based models (EBMs)
Matrix multiplication, signals, and convolutions
Supervised and self-supervised transfer learning (with PyTorch Lightning)
Taught by
Yann LeCun (lectures) and Alfredo Canziani (practica)