Randomized Algorithms for Rounding and Rank Compression in the Tensor Train Format
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Overview
Explore a 41-minute conference talk on randomized algorithms for rounding and rank compression in the Tensor Train format. Delve into Paul Cazeaux's presentation at IPAM's Many-body Quantum Systems via Classical and Quantum Computation Workshop. Discover how the Tensor-Train (TT) or Matrix-Product States (MPS) format provides a compact, low-rank representation for high-dimensional tensors, with applications in computing many-body ground states in spin models and quantum chemistry. Learn about a new suite of randomized algorithms designed for TT rounding, which preserve the format's integrity while offering significant computational advantages. Understand how these algorithms can achieve up to a 20× speedup, enhancing the performance of classical iterative Krylov methods like GMRES and Lanczos when applied to vectors in TT format. Gain insights into the comparative analysis of these randomized algorithms' empirical accuracy and computational efficiency against deterministic counterparts.
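The talk's randomized TT-rounding algorithms build on the general idea of sketch-based low-rank compression: multiply a matrix by a random Gaussian test matrix, orthonormalize the result, and truncate in the much smaller projected space. The sketch below is an illustrative example of that generic randomized range-finder idea (in the style of Halko, Martinsson, and Tropp), not the speaker's specific TT algorithms; all function names here are hypothetical.

```python
import numpy as np

def randomized_range_finder(A, rank, oversample=10, rng=None):
    """Orthonormal basis for the approximate range of A via a Gaussian sketch."""
    rng = np.random.default_rng(0) if rng is None else rng
    omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ omega)  # basis for the sketched column space
    return Q

def randomized_truncate(A, rank):
    """Compress A to the given rank using the randomized range finder.

    Returns factors L, R with A ≈ L @ R; this matrix-level step is the
    building block that randomized TT rounding applies core by core.
    """
    Q = randomized_range_finder(A, rank)
    B = Q.T @ A                           # small projected matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U[:, :rank]) * s[:rank], Vt[:rank, :]

# Usage: compress a matrix of exact rank 8 back to rank 8
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 150))
L, R = randomized_truncate(A, 8)
err = np.linalg.norm(A - L @ R) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

Because the sketch `A @ omega` touches `A` only through matrix products, the same idea applied to TT cores avoids forming large unfolding matrices explicitly, which is where the reported speedups over deterministic SVD-based rounding come from.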
Syllabus
Paul Cazeaux - Randomized Algorithms for Rounding and Rank Compression in the Tensor Train Format
Taught by
Institute for Pure & Applied Mathematics (IPAM)