
Leveraging Sparsity to Accelerate Automatic Differentiation

The Julia Programming Language via YouTube

Overview

Explore how to dramatically accelerate automatic differentiation by exploiting the inherent sparsity of Jacobians and Hessians in this 30-minute conference talk from JuliaCon Local Paris 2025. These matrices, often considered too expensive to compute, play vital roles in scientific computing and machine learning applications ranging from optimization to probabilistic modeling. The talk shows how building on top of DifferentiationInterface.jl brings Automatic Sparse Differentiation to all major Julia AD backends, including ForwardDiff and Enzyme. Gain insight into practical techniques for detecting and exploiting sparsity patterns to significantly reduce computational cost while preserving accuracy in Jacobian and Hessian computations, understand the implementation details and performance characteristics of sparse automatic differentiation within the Julia ecosystem, and see how these advances can accelerate your own scientific computing and machine learning workflows.
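As a rough illustration of the workflow the talk covers, the following sketch computes a sparse Jacobian through DifferentiationInterface.jl. It assumes the SparseConnectivityTracer.jl and SparseMatrixColorings.jl packages are installed for sparsity detection and matrix coloring; the example function is hypothetical, chosen only because its Jacobian is banded.

```julia
# Minimal sketch of Automatic Sparse Differentiation in Julia
# (assumes DifferentiationInterface, SparseConnectivityTracer,
#  SparseMatrixColorings, and ForwardDiff are installed).
using DifferentiationInterface
using SparseConnectivityTracer, SparseMatrixColorings
import ForwardDiff

# Example function with a banded Jacobian:
# each output depends on only two neighboring inputs.
f(x) = diff(x .^ 2)

# A dense backend, and a sparse wrapper around it that
# detects the sparsity pattern and colors the columns so
# structurally orthogonal columns share one forward pass.
dense_backend  = AutoForwardDiff()
sparse_backend = AutoSparse(
    dense_backend;
    sparsity_detector  = TracerSparsityDetector(),
    coloring_algorithm = GreedyColoringAlgorithm(),
)

x = rand(1000)
J = jacobian(f, sparse_backend, x)  # sparse Jacobian matrix
```

With the dense backend, the cost of the Jacobian grows with the input dimension; with the sparse backend, the coloring step reduces the number of required function evaluations to roughly the number of colors, which stays constant for banded patterns like this one.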

Syllabus

Leveraging Sparsity to Accelerate Automatic Differentiation | Hill | Paris 2025

Taught by

The Julia Programming Language

