
Matrix Calculus for Machine Learning and Beyond - IAP 2023

MIT OpenCourseWare via YouTube

Overview

Explore advanced mathematical techniques essential for modern machine learning and optimization in this comprehensive course, which bridges the gap between traditional calculus and matrix-based computation. Master matrix calculus by learning to think of matrices holistically rather than as arrays of scalars, and discover how to generalize and compute derivatives of complex matrix factorizations and operations. Develop skills in vectorization of matrix functions, Kronecker products, and Jacobians, and understand how familiar differentiation formulas must be reimagined for large-scale computing.

Delve into gradients and inner products in various vector spaces, nonlinear root finding, and the adjoint gradient methods essential for optimization problems. Study derivatives of matrix determinants and inverses, forward automatic differentiation using dual numbers, and differentiation on computational graphs. Examine advanced topics including adjoint differentiation of ODE solutions, calculus of variations, derivatives of random functions, second derivatives and Hessian matrices, derivatives of eigenproblems, and automatic differentiation techniques.

Gain a practical understanding of how these mathematical foundations support machine learning algorithms, large-scale optimization, and other modern computational applications through lectures taught by MIT faculty Alan Edelman and Steven G. Johnson.
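To give a flavor of one topic above, here is a minimal sketch of forward-mode automatic differentiation via dual numbers (the subject of Lecture 5 Part 2). The `Dual` class is a hypothetical illustration written for this summary, not code from the course:

```python
import math

class Dual:
    """Number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        # product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dx sin(u) = cos(u) * u'
    if isinstance(x, Dual):
        return Dual(math.sin(x.a), math.cos(x.a) * x.b)
    return math.sin(x)

# Evaluate f(x) = x**2 * sin(x) and f'(x) at x = 1.5 in a single pass
x = Dual(1.5, 1.0)          # seed the derivative dx/dx = 1
f = x * x * sin(x)
# f.a holds f(1.5); f.b holds f'(1.5) = 2x*sin(x) + x**2*cos(x) at 1.5
```

One evaluation of the function through dual arithmetic yields both the value and the exact derivative, with no finite-difference truncation error.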

Syllabus

Lecture 1 Part 1: Introduction and Motivation
Lecture 1 Part 2: Derivatives as Linear Operators
Lecture 2 Part 1: Derivatives in Higher Dimensions: Jacobians and Matrix Functions
Lecture 2 Part 2: Vectorization of Matrix Functions
Lecture 3 Part 1: Kronecker Products and Jacobians
Lecture 3 Part 2: Finite-Difference Approximations
Lecture 4 Part 1: Gradients and Inner Products in Other Vector Spaces
Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods
Lecture 5 Part 1: Derivative of Matrix Determinant and Inverse
Lecture 5 Part 2: Forward Automatic Differentiation via Dual Numbers
Lecture 5 Part 3: Differentiation on Computational Graphs
Lecture 6 Part 1: Adjoint Differentiation of ODE Solutions
Lecture 6 Part 2: Calculus of Variations and Gradients of Functionals
Lecture 7 Part 1: Derivatives of Random Functions
Lecture 7 Part 2: Second Derivatives, Bilinear Forms, and Hessian Matrices
Lecture 8 Part 1: Derivatives of Eigenproblems
Lecture 8 Part 2: Automatic Differentiation on Computational Graphs
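As a taste of the material in Lecture 3 Part 1, the course relates Kronecker products to Jacobians through the vectorization identity vec(AXB) = (Bᵀ ⊗ A) vec(X). A short numerical check of that identity, written here as an illustration rather than course code:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
X = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 2))

def vec(M):
    # column-major (Fortran-order) stacking, the convention under
    # which the identity vec(A X B) = (B^T kron A) vec(X) holds
    return M.reshape(-1, order="F")

lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
assert np.allclose(lhs, rhs)
```

The matrix Bᵀ ⊗ A is exactly the Jacobian of the linear map X ↦ AXB with respect to vec(X), which is why this identity is the workhorse for differentiating matrix expressions.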

Taught by

MIT OpenCourseWare

