

Low-Rank Models and Applications

Fields Institute via YouTube

Overview

Explore cutting-edge research in low-rank modeling through this comprehensive mini-symposium featuring 15 specialized talks on advanced mathematical and computational techniques. Delve into diverse applications including linear dynamical systems with Hankel nuclear norm regularization, over-parameterized tensor decomposition beyond lazy training frameworks, and generalized compressed sensing with novel measurement matrices. Examine dual principal component pursuit methods, multi-channel linear convolutional networks from a function space perspective, and data-driven approaches to dynamic interpolation and approximation. Investigate tomographic imaging under model uncertainty, rigidity theory for Gaussian graphical models, and non-separable relaxations of rank penalty classes. Learn about robust low-rank matrix completion using alternating manifold proximal gradient continuation methods, computational barriers in estimation from low-degree polynomials, and principal component analysis for high-dimensional heteroscedastic data. Discover connections between PCA, double descent phenomena, and Gaussian processes, while exploring sample optimal algorithms for low-rank approximation of positive semidefinite and distance matrices, concluding with techniques for imputing missing data using low-rank Gaussian copula models.
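The common thread running through these talks is the idea that a large matrix of interest is (approximately) low rank. As an illustrative sketch, not drawn from any particular talk, the Eckart–Young theorem says the best rank-k approximation in Frobenius norm comes from truncating the singular value decomposition:

```python
import numpy as np

# Build a matrix that is exactly rank 2, plus small noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
A_noisy = A + 0.01 * rng.standard_normal(A.shape)

def truncated_svd(M, k):
    """Best rank-k approximation of M in Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

A_hat = truncated_svd(A_noisy, 2)

# Truncating at the true rank strips most of the noise.
rel_err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)
```

Many of the methods surveyed here (nuclear norm regularization, matrix completion, PCA variants) can be read as ways of finding such a low-rank structure when the matrix is only partially or noisily observed.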

Syllabus

Learning Linear Dynamical Systems with Hankel Nuclear Norm Regularization
Beyond Lazy Training for Over-parameterized Tensor Decomposition
A family of measurement matrices for generalized compressed sensing
Dual Principal Component Pursuit
Function space view of Multi-Channel Linear Convolutional Networks with Bounded Weight Norm
Data-driven dynamic interpolation and approximation
Tomographic Imaging with Model Uncertainty
Rigidity theory for Gaussian graphical models: the maximum likelihood threshold of a graph
Non-Separable Relaxations of a Class of Rank Penalties
Robust Low-Rank Matrix Completion via an Alternating Manifold Proximal Gradient Continuation Method
Computational Barriers to Estimation from Low-Degree Polynomials
PCA for High-Dimensional Heteroscedastic Data
PCA, Double Descent, and Gaussian Processes
Sample Optimal Algorithms for Low Rank Approximation of PSD and Distance Matrices
Imputing Missing Data with the Low-Rank Gaussian Copula
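Several syllabus items (matrix completion, missing-data imputation) revolve around recovering a low-rank matrix from a subset of its entries. The sketch below uses the classic soft-impute / singular value thresholding baseline, not the alternating manifold proximal gradient method or Gaussian copula model from the talks; the matrix sizes, rank, and threshold `tau` are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
# Ground-truth rank-2 matrix with roughly half its entries observed.
L = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(L.shape) < 0.5

def svt(M, tau):
    """Singular value soft-thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U * np.maximum(s - tau, 0.0) @ Vt

# Soft-impute: alternate between filling missing entries with the
# current estimate and shrinking singular values toward low rank.
X = np.where(mask, L, 0.0)
for _ in range(200):
    X = svt(np.where(mask, L, X), tau=0.5)

rel_err = np.linalg.norm(X - L) / np.linalg.norm(L)
```

The nuclear norm acts as a convex surrogate for rank, which is why soft-thresholding the singular values at each step drives the iterate toward a low-rank completion consistent with the observed entries.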

Taught by

Fields Institute

