Syllabus
Learning Linear Dynamical Systems with Hankel Nuclear Norm Regularization
Beyond Lazy Training for Over-parameterized Tensor Decomposition
A family of measurement matrices for generalized compressed sensing
Dual Principal Component Pursuit
Function space view of Multi-Channel Linear Convolutional Networks with Bounded Weight Norm
Data-driven dynamic interpolation and approximation
Tomographic Imaging with Model Uncertainty
Rigidity theory for Gaussian graphical models: the maximum likelihood threshold of a graph
Non-Separable Relaxations of a Class of Rank Penalties
Robust Low-Rank Matrix Completion via an Alternating Manifold Proximal Gradient Continuation Method
Computational Barriers to Estimation from Low-Degree Polynomials
PCA for High-Dimensional Heteroscedastic Data
PCA, Double Descent, and Gaussian Processes
Sample Optimal Algorithms for Low Rank Approximation of PSD and Distance Matrices
Imputing Missing Data with the Low-Rank Gaussian Copula
Taught by
Fields Institute