Overview
Explore cutting-edge research at the intersection of deep learning and dynamical systems through this symposium featuring seven expert presentations. Delve into the frequency principle governing deep learning behavior, discover methods for computing Lyapunov functions with neural networks while avoiding the curse of dimensionality, and examine local conformal autoencoders for advanced data representation. Investigate philosophical connections between Plato's theory of forms and the continuous limits of artificial neural networks, analyze the tradeoffs between recurrence and attention mechanisms in self-attentive architectures, and understand how control theory principles can help ensure AI safety. Learn how neural networks can solve partial differential equations, gaining insight into how machine learning techniques address complex mathematical and theoretical challenges in dynamical systems.
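As background for the Lyapunov-function talk, the sketch below (not taken from the symposium; the matrices `A`, `Q`, and `P` are illustrative examples) shows the classical conditions that neural-network approaches aim to satisfy at scale: for a stable linear system dx/dt = Ax, a quadratic V(x) = xᵀPx with AᵀP + PA = -Q, Q > 0, is positive definite and decreases along every trajectory.

```python
import numpy as np

def lyapunov_decrease(A, P, x):
    """dV/dt along trajectories of dx/dt = Ax, for V(x) = x^T P x."""
    return x @ (A.T @ P + P @ A) @ x

# Example stable system (eigenvalues -1 and -2), chosen for illustration.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])

# Solve the Lyapunov equation A^T P + P A = -I by vectorizing it into a
# linear system with Kronecker products (small-dimension textbook route;
# the symposium talk targets the high-dimensional case, where this fails).
n = A.shape[0]
I = np.eye(n)
K = np.kron(I, A.T) + np.kron(A.T, I)
P = np.linalg.solve(K, (-I).flatten()).reshape(n, n)

# Verify the two Lyapunov conditions: V positive definite, dV/dt < 0.
assert np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(n)
    assert lyapunov_decrease(A, P, x) < 0
print("V(x) = x^T P x verified as a Lyapunov function")
```

The Kronecker-product solve scales as O(n⁶) in the state dimension, which is exactly the curse of dimensionality that neural-network parameterizations of V are meant to sidestep.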
Syllabus
Frequency Principle in Deep Learning
Computing Lyapunov functions via neural networks avoiding the curse of dimensionality
Local Conformal AutoEncoder
Do ideas have shape? Plato's theory of forms as the continuous limit of artificial neural networks
Recurrence vs Attention: untangling tradeoffs in self-attentive neural networks
Safe AI with control theory
Neural Networks for Solving PDEs
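As a taste of the PDE session's theme, here is a minimal collocation sketch (an assumption for illustration, not the speakers' method): a random-feature network for u''(x) = -sin(x) with u(0) = u(π) = 0, whose exact solution is u(x) = sin(x). Freezing the hidden layer tanh(ax + b) makes the PDE residual linear in the output weights, so one least-squares solve replaces gradient-descent training.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 200                                   # hidden width (assumed value)
a = rng.uniform(-4.0, 4.0, m)             # frozen input weights
b = rng.uniform(-4.0, 4.0, m)             # frozen biases

def features(x):
    """Hidden activations and their exact second derivatives in x."""
    t = np.tanh(np.outer(x, a) + b)       # shape (len(x), m)
    phi_xx = -2.0 * t * (1.0 - t**2) * a**2
    return t, phi_xx

x = np.linspace(0.0, np.pi, 100)          # collocation points
phi, phi_xx = features(x)
bc_phi, _ = features(np.array([0.0, np.pi]))

# Stack PDE residual rows (u'' = -sin x) and weighted boundary rows,
# then fit the output weights in a single least-squares solve.
A_ls = np.vstack([phi_xx, 50.0 * bc_phi])
rhs = np.concatenate([-np.sin(x), [0.0, 0.0]])
w, *_ = np.linalg.lstsq(A_ls, rhs, rcond=None)

u = phi @ w
err = np.max(np.abs(u - np.sin(x)))
print(f"max error vs exact solution: {err:.2e}")
```

Full physics-informed neural networks follow the same recipe but also train the hidden layer by gradient descent, using automatic differentiation for the PDE terms.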
Taught by
Fields Institute