The Perceptron as a Roadmap - From Neuron Structure to Artificial Neural Networks
Schmid College, Chapman University via YouTube
Overview
Explore the evolution of artificial neural networks in this graduate-level colloquium lecture by Professor Daniel Alpay. Trace the development from the discovery of neuron structure to modern machine learning algorithms, using the perceptron as a central focus. Delve into the historical context of the perceptron, created by psychologist Frank Rosenblatt in the late 1950s for image classification, and its foundation in the McCulloch and Pitts neuron model from 1943. Examine the perceptron algorithm as one of the earliest machine learning techniques and its influence on subsequent artificial neural network structures. Investigate connections to Hopfield networks, associative memories, and function approximation. Discover unexpected links to mathematicians such as Agmon, Schoenberg, and Wiener in this comprehensive exploration of the intersection between neuroscience, computer science, and mathematics.
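To make the lecture's central object concrete, here is a minimal sketch of the classic perceptron learning rule (threshold prediction plus error-driven weight updates), trained on the linearly separable AND function. The data, function names, and learning rate are illustrative choices, not taken from the lecture itself.

```python
# Minimal Rosenblatt-style perceptron sketch: learn the logical AND function.
# All names, data, and hyperparameters here are illustrative assumptions.

def perceptron_train(samples, labels, epochs=10, lr=1.0):
    """Learn weights w and bias b with the classic perceptron update rule."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict with a hard threshold, then correct only on mistakes.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def perceptron_predict(w, b, x):
    """Fire (output 1) when the weighted sum exceeds the threshold."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# AND is linearly separable, so the perceptron convergence theorem applies.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = perceptron_train(X, y)
print([perceptron_predict(w, b, x) for x in X])  # → [0, 0, 0, 1]
```

The hard-threshold unit here mirrors the McCulloch–Pitts neuron the lecture describes; the learning rule is what Rosenblatt added on top of it.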
Syllabus
Daniel Alpay: The Perceptron as a Roadmap (Graduate Colloquium in Math, Philosophy and Physics)
Taught by
Schmid College, Chapman University