Learning Paradigms for Neural Networks: The Locally Backpropagated Forward-Forward Algorithm
Inside Livermore Lab via YouTube
Overview
Explore a cutting-edge approach to neural network training in this 57-minute talk by Fabio Giampaolo from the University of Naples Federico II. Delve into the Locally Backpropagated Forward-Forward training strategy, a novel method that combines the effectiveness of backpropagation with the appealing properties of the Forward-Forward algorithm. Understand how this technique addresses limitations of traditional training, particularly when integrating deep learning into complex frameworks for physics-related problems. Learn about challenges such as incorporating non-differentiable components into neural architectures and implementing distributed learning on heterogeneous devices. Gain insight into how this approach can broaden the applicability of AI in real-world settings where conventional methods fall short.
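For context, the talk builds on Hinton's Forward-Forward idea, in which each layer is trained locally: a "goodness" score (commonly the sum of squared activations) is pushed above a threshold for positive data and below it for negative data, with gradients confined to that layer. Below is a minimal NumPy sketch of one such layer-local update; the threshold `theta`, learning rate, and layer sizes are illustrative choices, not values from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def ff_layer_step(W, b, x, positive, theta=2.0, lr=0.05):
    """One layer-local Forward-Forward update (illustrative sketch).

    Goodness is the sum of squared ReLU activations; the loss pushes
    goodness above `theta` for positive samples and below it for
    negative samples. Gradients never leave this layer.
    """
    z = W @ x + b
    h = relu(z)
    g = float(np.sum(h ** 2))                 # goodness of this sample
    sign = 1.0 if positive else -1.0
    # loss = log(1 + exp(-sign * (g - theta))); derivative w.r.t. g:
    dL_dg = -sign / (1.0 + np.exp(sign * (g - theta)))
    dL_dz = dL_dg * 2.0 * h * (z > 0)         # chain rule, within the layer only
    W -= lr * np.outer(dL_dz, x)
    b -= lr * dL_dz
    return g

# Toy demo: one 8 -> 16 layer, one positive and one negative input.
W = 0.1 * rng.normal(size=(16, 8))
b = np.zeros(16)
x_pos = rng.normal(size=8)
x_neg = rng.normal(size=8)

for _ in range(200):
    ff_layer_step(W, b, x_pos, positive=True)
    ff_layer_step(W, b, x_neg, positive=False)

g_pos = float(np.sum(relu(W @ x_pos + b) ** 2))
g_neg = float(np.sum(relu(W @ x_neg + b) ** 2))
```

After training, the positive input's goodness exceeds the negative input's. The locally backpropagated variant presented in the talk refines how gradients are handled within each local block; this sketch shows only the plain layer-local update for orientation.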
Syllabus
DDPS | Learning paradigms for neural networks: The locally backpropagated forward-forward algorithm
Taught by
Inside Livermore Lab