Architectural Constraints on Recurrent Network Dynamics
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Overview
Explore the intricate relationship between network architecture and dynamics in recurrent neural networks through this lecture by Carina Curto of Penn State University. Delve into threshold-linear networks, simplified models that nonetheless exhibit the full range of nonlinear behaviors: stable fixed points, limit cycles, quasiperiodic attractors, and chaos. Examine how connectomes both enable and constrain neural computation, and investigate bifurcation theory as a function of synaptic weights and neuromodulation. Gain insight into the mathematical underpinnings of these constraints through the combinatorial geometry of the hyperplane arrangements associated with the model. Recorded at IPAM's Mathematical Approaches for Connectome Analysis Workshop, this talk offers a deep dive into architectural constraints on recurrent network dynamics, valuable for researchers and students in computational neuroscience and applied mathematics.
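The threshold-linear networks discussed in the talk follow dynamics of the form dx/dt = -x + [Wx + b]_+, where [·]_+ is the rectification max(0, ·). A minimal simulation sketch is below; the specific weight matrix (a 3-neuron cyclic graph in the combinatorial-TLN convention, with illustrative parameters ε = 0.25, δ = 0.5) is an assumption for demonstration, not an example taken from the lecture.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of a threshold-linear network:
    dx/dt = -x + [W x + b]_+, with [.]_+ = elementwise max(0, .)."""
    x = np.asarray(x0, dtype=float)
    traj = np.empty((steps, x.size))
    for t in range(steps):
        x = x + dt * (-x + np.maximum(0.0, W @ x + b))
        traj[t] = x
    return traj

# Hypothetical 3-cycle graph (1 -> 2 -> 3 -> 1) in the CTLN convention:
# W_ij = -1 + eps if j -> i, -1 - delta otherwise, zero diagonal.
eps, delta = 0.25, 0.5
W = np.array([[0.0,          -1.0 - delta, -1.0 + eps],
              [-1.0 + eps,    0.0,         -1.0 - delta],
              [-1.0 - delta, -1.0 + eps,    0.0]])
b = np.ones(3)  # uniform external drive (illustrative)

traj = simulate_tln(W, b, x0=[0.1, 0.0, 0.0])
```

With purely inhibitory off-diagonal weights and positive drive, activity remains bounded and nonnegative; for cyclic graphs of this kind the trajectory does not settle into a stable fixed point but oscillates, illustrating how graph structure shapes the attractor.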
Syllabus
Carina Curto - Architectural constraints on recurrent network dynamics - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)