Tensors.jl - Efficient Tensor Operations and Automatic Differentiation in Julia
The Julia Programming Language via YouTube
Overview
Explore Tensors.jl, a core Julia package for efficient tensor operations in scientific computing, in this 24-minute conference presentation from FerriteCon 2025. Learn how Tensors.jl provides high-performance implementations of both symmetric and non-symmetric tensor computations that map closely to mathematical notation, improving both clarity and performance over traditional approaches such as Voigt notation. Discover the package's internal workings and see how its automatic differentiation support lets tensorial operations be differentiated efficiently without hand-derived derivatives, reducing errors and speeding up development of models involving gradients or sensitivity analysis. Gain insight into why Tensors.jl is recommended for use within Ferrite.jl and in user-written Julia code, particularly in computational science applications involving continuum mechanics and finite element methods, and learn how it integrates with the broader Julia ecosystem for high-performance tensor computation workflows.
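The automatic differentiation workflow described above can be sketched as follows. This is a minimal illustration, not material from the talk: it defines a hypothetical strain-energy function `ψ` of the right Cauchy-Green tensor and obtains the stress by differentiating it with `Tensors.gradient`, instead of deriving the stress expression by hand. The material constants `μ` and `λ` are placeholder values chosen for the example.

```julia
using Tensors

# Hypothetical Neo-Hookean-style strain energy as a function of the
# right Cauchy-Green tensor C (material constants are placeholders).
function ψ(C::SymmetricTensor{2,3})
    μ, λ = 1.0, 1.0
    J = sqrt(det(C))
    return μ / 2 * (tr(C) - 3) - μ * log(J) + λ / 2 * log(J)^2
end

# Build a symmetric positive-definite C from a random deformation gradient.
F = one(Tensor{2,3}) + 0.1 * rand(Tensor{2,3})
C = tdot(F)  # Fᵀ⋅F, returned as a SymmetricTensor

# Second Piola-Kirchhoff stress S = 2 ∂ψ/∂C, computed by automatic
# differentiation rather than a hand-derived formula.
S = 2 * gradient(ψ, C)
```

In the undeformed state `C = I` this energy is at a minimum, so the computed stress vanishes, which gives a quick sanity check on the differentiated result.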
Syllabus
Welcome!
Taught by
The Julia Programming Language