Redundancy and Implicit Regularization in Neural Network Models

Fields Institute via YouTube

Overview

Explore the fundamental question of why neural network training selects specific parameter representations when many equivalent configurations exist, in this 55-minute conference talk. Examine functional redundancy in neural networks, where many distinct parameter settings realize the same function, and investigate the geometric properties of parameter fibers — the sets of all configurations that realize a given function. The talk focuses on two model families, deep linear networks (DLNs) and feedforward ReLU networks, analyzing how training dynamics influence which representation is selected. Learn about ongoing research on balanced representations in deep linear networks, and discover how implicit regularization mechanisms guide optimization toward particular solutions among functionally equivalent alternatives.
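The redundancy idea can be made concrete with a tiny sketch (an illustration of the general phenomenon, not material from the talk): a two-layer deep linear network computes f(x) = W2·W1·x, so rescaling the factors as (c·W1, W2/c) moves along the parameter fiber without changing the function. Among these equivalent points, the "balanced" ones — where W2ᵀW2 = W1W1ᵀ — are the representations that gradient-flow training is known to preserve.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 2-layer deep linear network computes f(x) = W2 @ W1 @ x.
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))

# Rescaling the factors as (c*W1, W2/c) leaves the function unchanged,
# so both parameter settings lie on the same fiber.
c = 5.0
W1b, W2b = c * W1, W2 / c

x = rng.standard_normal(3)
print(np.allclose(W2 @ (W1 @ x), W2b @ (W1b @ x)))  # same function

# "Balance gap": how far a parameterization is from satisfying
# W2^T W2 = W1 W1^T, the invariant preserved by gradient flow on DLNs.
def balance_gap(A, B):
    return np.linalg.norm(B.T @ B - A @ A.T)

print(balance_gap(W1, W2))    # original pair
print(balance_gap(W1b, W2b))  # rescaled pair: same function, different gap
```

The two parameterizations are indistinguishable as functions, yet they differ in their balance gap — one way to see why a training dynamic that conserves W2ᵀW2 − W1W1ᵀ effectively selects a particular point on each fiber.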

Syllabus

Redundancy and Implicit Regularization in Neural Network Models

Taught by

Fields Institute
