Explore a thought-provoking conference talk from the CMSA Conference on Mathematics in Science that delves into the mathematical foundations of machine learning, quantum machine learning, and dynamics. Discover how deep neural networks (DNNs) operate through stretching-and-folding mechanisms akin to the logistic map and the Smale horseshoe, and examine the role of chaos in their expressivity and trainability. Learn about potential connections between the Kolmogorov-Arnold representation theorem and machine learning systems, and investigate the presence of linear maps in large language models. Consider the fascinating parallels between emergent tensor structures in highly trained maps and those found at local minima of loss functions in high-energy physics. Through this 80-minute presentation, gain insight into how mathematical concepts can advance our understanding and development of machine learning technologies.
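For context on the Kolmogorov-Arnold representation theorem mentioned above (this statement is standard background, not taken from the talk description): every continuous function of $n$ variables on the unit cube can be written using only sums and continuous functions of one variable,

$$
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),
$$

where the $\Phi_q$ and $\phi_{q,p}$ are continuous univariate functions. The resemblance of this two-layer sum-of-compositions structure to neural network architectures is what motivates the connection explored in the talk.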
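The stretching-and-folding behavior the talk compares to deep networks can be seen in the logistic map itself. The sketch below (not from the talk; a standard illustration) iterates the map at parameter r = 4, where it stretches the unit interval and folds it back, so two nearby starting points separate rapidly:

```python
def logistic(x, r=4.0):
    # One iteration of the logistic map: stretches [0, 1] by the factor r
    # and folds it back into [0, 1] via the (1 - x) term.
    return r * x * (1.0 - x)

# Two initial conditions differing by only 1e-9
x, y = 0.2, 0.2 + 1e-9

# Track the largest separation reached: chaos amplifies the tiny
# initial difference to a macroscopic one within a few dozen steps.
max_sep = 0.0
for _ in range(40):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)  # far larger than the initial 1e-9
```

This sensitivity to initial conditions is the same mechanism the talk relates to the expressivity (and potential trainability problems) of very deep maps.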