A Local Graph Limits Perspective on Sampling-Based GNNs

Simons Institute via YouTube

Overview

Explore a theoretical framework for understanding sampling-based Graph Neural Networks (GNNs) through the lens of local graph limits in this 39-minute conference talk. Survey various sampling approaches including neighborhood, layer-wise, cluster, and subgraph sampling methods that have made GNNs practically scalable for modern applications. Discover how the Benjamini-Schramm local graph-limit perspective provides a unifying theoretical tool for analyzing sampling-based GNNs. Learn about rigorous proofs demonstrating that, under mild assumptions, parameters learned from training GNNs on small, fixed-size samples of large input graphs remain within an ε-neighborhood of those obtained by training on the entire graph. Examine derived bounds on the number of samples, subgraph size, and training steps required for effective learning. Gain insights into the principled explanation for why training on subgraph samples works empirically, connecting to established notions of transferability in the literature and bridging the gap between practical sampling methods and theoretical understanding in graph neural network research.
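The neighborhood sampling the talk analyzes can be illustrated with a minimal, pure-Python sketch (the graph, function names, and parameters here are hypothetical, not from the talk): each training step sees only a small rooted subgraph whose size is bounded by the fanout and depth, independent of the size of the full input graph.

```python
import random

def sample_neighborhood(adj, root, depth, fanout, rng):
    """Sample a rooted subgraph by keeping at most `fanout` random
    neighbors per node, expanded for `depth` hops (neighborhood sampling)."""
    nodes = {root}
    frontier = [root]
    for _ in range(depth):
        nxt = []
        for u in frontier:
            picked = rng.sample(adj[u], min(fanout, len(adj[u])))
            for v in picked:
                if v not in nodes:
                    nodes.add(v)
                    nxt.append(v)
        frontier = nxt
    # Induced subgraph on the sampled nodes: this is what a GNN
    # training step would actually consume instead of the full graph.
    return {u: [v for v in adj[u] if v in nodes] for u in nodes}

rng = random.Random(0)
# Toy "large" graph: a ring of 1000 nodes plus random chords.
n = 1000
adj = {u: [(u - 1) % n, (u + 1) % n] for u in range(n)}
for _ in range(2000):
    a, b = rng.randrange(n), rng.randrange(n)
    if a != b and b not in adj[a]:
        adj[a].append(b)
        adj[b].append(a)

sub = sample_neighborhood(adj, root=0, depth=2, fanout=3, rng=rng)
# Sample size is bounded by 1 + fanout + fanout**2 = 13 nodes,
# regardless of how large the input graph is.
print(len(sub))
```

The fixed-size bound on the sample is what makes the talk's ε-neighborhood result plausible: training only ever touches subgraphs whose size depends on the sampling parameters, not on the full graph.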

Syllabus

A Local Graph Limits Perspective on Sampling-Based GNNs

Taught by

Simons Institute
