
YouTube

Understanding Deep Learning Models via Interaction Importance

Simons Foundation via YouTube

Overview

Explore deep learning model interpretability through interaction importance analysis in this 51-minute conference talk from the 2025 Mathematical and Scientific Foundations of Deep Learning Annual Meeting. The talk examines how deep neural networks make decisions by analyzing interactions between features and components within these models, going beyond traditional single-feature importance measures. It covers practical techniques for quantifying and visualizing these interactions, the mathematical foundations of interaction importance measures, and their role in making deep learning models more transparent and trustworthy for researchers and practitioners across application domains.
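To make the idea of interaction importance concrete, here is a minimal, model-agnostic sketch (not the specific method presented in the talk): a second-order finite difference that measures how far a model departs from being additive in a pair of features. The function name, baseline choice, and toy model are all illustrative assumptions.

```python
import numpy as np

def pairwise_interaction(f, x, i, j, baseline=0.0):
    """Estimate how strongly features i and j interact in model f at
    input x, via a second-order finite difference against a baseline.
    Illustrative sketch only -- not the method from the talk."""
    x_i = x.copy(); x_i[i] = baseline            # ablate feature i
    x_j = x.copy(); x_j[j] = baseline            # ablate feature j
    x_ij = x.copy(); x_ij[i] = baseline; x_ij[j] = baseline  # ablate both
    # If f were purely additive in features i and j, this is zero.
    return f(x) - f(x_i) - f(x_j) + f(x_ij)

# Toy model with an explicit multiplicative interaction between x0 and x1.
f = lambda x: 2.0 * x[0] + 3.0 * x[1] + 5.0 * x[0] * x[1] + x[2]

x = np.array([1.0, 1.0, 1.0])
print(pairwise_interaction(f, x, 0, 1))  # recovers the interaction: 5.0
print(pairwise_interaction(f, x, 0, 2))  # additive pair: 0.0
```

For a single interacting pair this recovers the interaction coefficient exactly; for real networks such differences are typically averaged over many inputs, and richer measures (e.g., Shapley-style interaction indices) refine the same basic idea.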

Syllabus

Bin Yu — Understanding Deep Learning Models via Interaction Importance (Sept. 25, 2025)

Taught by

Simons Foundation

