Overview
Explore the theoretical foundations of neural network computational capacity through the lens of VC-dimension theory in this tutorial. The tutorial analyzes networks in which each neuron computes the sign of a linear combination of Boolean inputs, and shows how to establish upper bounds on the predicates over R^n that such networks can compute. It examines the intersection of computational complexity theory and machine learning, using VC-dimension to give rigorous insight into the learning capabilities and limitations of neural architectures. Along the way, it develops principles of descriptional complexity as they apply to neural computation, building on foundational work in computational complexity theory initiated by Kolmogorov and continued in modern machine learning theory.
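The description above concerns networks whose neurons are linear threshold units. As an illustrative sketch (not material from the tutorial itself), the snippet below defines such a unit, sign(w·x + b), and checks whether units drawn from a small integer weight grid can realize every ±1 labeling of a point set, i.e. shatter it; this is exactly the notion VC-dimension is built on. The function names (`neuron`, `shatters`) and the weight grid are hypothetical choices for the demo.

```python
from itertools import product

def sign(z):
    # convention: sign(0) = +1, matching "sign of a linear combination"
    return 1 if z >= 0 else -1

def neuron(w, b, x):
    # a single threshold unit: sign of a linear combination of the inputs
    return sign(sum(wi * xi for wi, xi in zip(w, x)) + b)

def shatters(points, weight_range=range(-3, 4)):
    """Check whether threshold units with integer weights from a small
    grid realize every +/-1 labeling of `points` (i.e. shatter them)."""
    realizable = set()
    for w1, w2, b in product(weight_range, repeat=3):
        realizable.add(tuple(neuron((w1, w2), b, p) for p in points))
    return len(realizable) == 2 ** len(points)

# three points in general position are shattered by halfplanes,
# so the VC-dimension of a single threshold unit on R^2 is at least 3
print(shatters([(0, 0), (1, 0), (0, 1)]))                # True
# the XOR configuration cannot be separated by any single threshold unit
print(shatters([(0, 0), (1, 1), (1, 0), (0, 1)]))        # False
```

The small integer grid suffices here because, for finitely many Boolean points, any separating hyperplane can be perturbed to one with small rational (hence integer, after scaling) coefficients; a full VC-dimension argument would quantify over all real weights.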
Syllabus
Neural networks and VC-dimension (tutorial, Alexander Kozachinskiy) 2025-09-08
Taught by
Kolmogorov-Seminar