Probabilistic Inference Using Contraction of Tensor Networks
The Julia Programming Language via YouTube
Overview
Explore probabilistic inference via tensor network contraction in this 29-minute conference talk from JuliaCon 2024. Dive into reasoning under uncertainty and learn how TensorInference.jl, a Julia package, combines probabilistic graphical models (PGMs) with tensor networks to improve performance on complex probabilistic inference tasks. Examine the challenges of exact and approximate inference, and see how tensor networks offer a compact representation of the states of complex systems. Gain insights into optimizing contraction sequences, leveraging differentiable programming, and applying contraction-order optimizers such as TreeSA, SABipartite, KaHyParBipartite, and GreedyMethod. Learn about the package's support for generic element types, hyper-optimized contraction order settings, and integration with BLAS routines and GPU technology for improved efficiency. Explore applications in AI, medical diagnosis, computer vision, and natural language processing, and understand the potential of exact methods in probabilistic inference.
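To make the core idea concrete, here is a minimal numpy sketch (not the TensorInference.jl API) of how a PGM's joint distribution factorizes into small tensors, so that a marginal becomes a tensor-network contraction. The chain model A → B → C and all probability tables below are hypothetical illustration values.

```python
import numpy as np

# Hypothetical 3-variable chain model A -> B -> C, all binary.
# The joint factorizes as P(a, b, c) = P(a) * P(b|a) * P(c|b),
# i.e. a product of small tensors -- a tensor network.
p_a = np.array([0.6, 0.4])           # P(A)
p_b_given_a = np.array([[0.7, 0.3],  # P(B|A): rows indexed by A
                        [0.2, 0.8]])
p_c_given_b = np.array([[0.9, 0.1],  # P(C|B): rows indexed by B
                        [0.5, 0.5]])

# The marginal P(C) is a contraction of the network:
# sum over the internal indices a and b.
p_c = np.einsum("a,ab,bc->c", p_a, p_b_given_a, p_c_given_b)

# Brute-force check: materialize the full joint table and sum it out.
joint = np.einsum("a,ab,bc->abc", p_a, p_b_given_a, p_c_given_b)
assert np.allclose(p_c, joint.sum(axis=(0, 1)))
print(p_c)  # -> [0.7 0.3]
```

The contraction order is what the talk's optimizers (TreeSA, GreedyMethod, and the bipartite methods) search over: contracting along the chain keeps every intermediate tensor small, whereas materializing the full joint first costs exponentially more memory and time as the number of variables grows.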
Syllabus
Probabilistic inference using contraction of tensor networks | Roa-Villescas | JuliaCon 2024
Taught by
The Julia Programming Language