
Spontaneous Kolmogorov-Arnold Geometry in Vanilla Fully-Connected Neural Networks

Harvard CMSA via YouTube

Overview

Explore how Kolmogorov-Arnold (KA) geometry spontaneously emerges in conventional neural network training through this 49-minute conference talk. Examine the distinctive local geometry and "texture" of universal functions, characterized by the Jacobian matrix J(x) as it varies over the data, and see how this KA geometry develops naturally during the optimization of vanilla single-hidden-layer fully-connected neural networks (MLPs). Learn to quantify KA geometry through statistical properties of exterior powers of J(x), including zero-row analysis and minor statistics that measure scale and axis alignment. Understand the resulting phase diagram, in the space of function complexity and model hyperparameters, where KA geometry occurs, gaining insight into how neural networks organically prepare input data for downstream processing. Finally, investigate the potential for accelerating learning through strategic interventions in network hyperparameters informed by KA geometry emergence. This is the complementary perspective to engineered KA-Networks (KANs): observing KA structure develop naturally in shallow MLPs rather than designing it into the architecture explicitly.
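The diagnostics mentioned above can be made concrete. The following is a minimal NumPy sketch, not taken from the talk: the toy architecture, the tolerance, and all function names are illustrative. It computes the Jacobian J(x) of a random single-hidden-layer MLP in closed form, then probes two of the statistics described: the fraction of near-zero rows and the distribution of k x k minors (entries of the k-th exterior power of J).

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Hypothetical toy model: f(x) = W2 @ tanh(W1 @ x), a single-hidden-layer MLP
d_in, d_hid, d_out = 3, 16, 3
W1 = rng.normal(size=(d_hid, d_in))
W2 = rng.normal(size=(d_out, d_hid))

def jacobian(x):
    # For this architecture, J(x) = W2 @ diag(tanh'(W1 @ x)) @ W1
    s = 1.0 - np.tanh(W1 @ x) ** 2          # tanh' at the pre-activations
    return W2 @ (s[:, None] * W1)

def zero_row_fraction(J, tol=1e-3):
    # Fraction of rows with (near-)zero norm: a crude probe of the
    # zero-row analysis mentioned in the overview; tol is an assumption.
    norms = np.linalg.norm(J, axis=1)
    return float(np.mean(norms < tol * norms.max()))

def minor_stats(J, k=2):
    # Mean and std of all k x k minors of J; these are the entries of the
    # k-th exterior power of J, whose statistics measure scale/axis alignment.
    rows = combinations(range(J.shape[0]), k)
    minors = np.array([
        np.linalg.det(J[np.ix_(r, c)])
        for r in rows
        for c in combinations(range(J.shape[1]), k)
    ])
    return minors.mean(), minors.std()

x = rng.normal(size=d_in)
J = jacobian(x)
print(J.shape)              # (3, 3)
print(zero_row_fraction(J))
print(minor_stats(J))
```

In practice one would evaluate these statistics over many data points x drawn from the training distribution and track them over the course of optimization, which is how the emergence of KA geometry is detected.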

Syllabus

Michael Mulligan | Spontaneous Kolmogorov-Arnold Geometry in Vanilla Fully-Connected Neural Networks

Taught by

Harvard CMSA

