
Neural Networks and VC-Dimension Tutorial

Kolmogorov-Seminar via YouTube

Overview

Explore the theoretical foundations of neural network computational capacity through the lens of VC-dimension theory. The tutorial analyzes networks in which each neuron computes the sign of a linear combination of its inputs, and shows how to establish upper bounds on the VC-dimension of the predicates on R^n that such networks can compute. It examines how VC-dimension connects computational complexity theory with machine learning, yielding rigorous insight into the learning capabilities and limitations of neural architectures, and situates these results within the tradition of descriptional complexity founded by Kolmogorov and continued in modern machine learning theory.
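To make the central notion concrete, here is a small illustrative sketch (not taken from the tutorial itself): the VC-dimension of a hypothesis class is the size of the largest point set it can shatter, i.e. label in every possible way. A single threshold neuron sign(w·x + b) on R^2 is a standard example; halfspaces in R^n have VC-dimension n + 1, so three points in general position in the plane can be shattered. The brute-force grid search below, over hypothetical toy points, checks this directly.

```python
from itertools import product

def sign(z):
    # Convention: strictly positive -> +1, otherwise -1.
    return 1 if z > 0 else -1

# Three points in general position in R^2 (a hypothetical toy instance).
points = [(0, 0), (1, 0), (0, 1)]

# A single threshold neuron: sign(w1*x1 + w2*x2 + b).
def neuron(w1, w2, b, x):
    return sign(w1 * x[0] + w2 * x[1] + b)

# Try to realize each of the 2^3 = 8 labelings with some weight setting
# from a small grid. If every labeling is realizable, the point set is
# shattered, so the VC-dimension of threshold units on R^2 is at least 3
# (it is in fact exactly 3 = n + 1 for n = 2).
grid = [-2, -1, 0, 1, 2]
shattered = all(
    any(
        all(neuron(w1, w2, b, x) == y for x, y in zip(points, labeling))
        for w1, w2, b in product(grid, repeat=3)
    )
    for labeling in product([-1, 1], repeat=3)
)
print(shattered)  # True: all 8 labelings are realizable
```

The tutorial's setting, where whole networks of such neurons are composed, asks the analogous question at scale: how does the VC-dimension grow with the number of neurons and weights?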

Syllabus

Neural networks and VC-dimension (tutorial, Alexander Kozachinskiy) 2025-09-08

Taught by

Kolmogorov-Seminar

