Stanford Seminar 2022 - Self Attention and Non-Parametric Transformers
Stanford University via YouTube
Overview
Explore the origins and intuitions of Transformers, followed by an in-depth discussion of Non-Parametric Transformers (NPTs), in this Stanford seminar. It begins with a 15-minute overview of Transformer fundamentals by Aidan, a PhD student at Oxford and co-founder of Cohere. Neil and Jannik, both PhD students at the University of Oxford, then delve into NPTs, recently accepted at NeurIPS. Gain insights into massive neural networks, Bayesian deep learning, active learning, and how non-parametric models combine with Transformers. Learn from these emerging researchers as they share their expertise in building and implementing advanced AI models.
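The self-attention mechanism at the heart of the Transformer overview can be sketched in a few lines. This is a minimal, illustrative NumPy implementation of scaled dot-product self-attention (not code from the seminar); the matrix names and dimensions are assumptions chosen for clarity.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings.

    X:          (n_tokens, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices (random here)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (n, n) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # (n, d_k) attended output

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                           # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one attended vector per token
```

Each output row is a weighted average of the value vectors, with weights set by how strongly that token's query matches every token's key — the intuition the seminar's first segment builds on. NPTs extend this idea by attending between datapoints in a dataset, not just between tokens in a sequence.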
Syllabus
CS25 I Stanford Seminar 2022 - Self Attention and Non-parametric transformers (NPTs)
Taught by
Stanford Online