YouTube

Transformers Part 2

UofU Data Science via YouTube

Overview

Dive into the second part of this lecture on the Transformer architecture, a 79-minute presentation from the University of Utah Data Science program covering advanced concepts and implementations. Building on foundational knowledge of attention mechanisms and self-attention, the lecture examines deeper architectural components, training strategies, and practical applications of transformer models. Accompanying slides walk through multi-head attention, positional encoding, layer normalization, and the feed-forward networks within the transformer framework. Learn how these components work together to enable powerful natural language processing, and understand the mathematical foundations that make transformers effective for sequence-to-sequence tasks, language modeling, and downstream applications in machine learning.
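Two of the components the lecture covers, scaled dot-product attention and sinusoidal positional encoding, can be sketched compactly. The following is a minimal NumPy illustration (not code from the lecture itself), assuming a single attention head, 2-D query/key/value matrices, and an even model dimension `d_model`:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """PE[pos, 2i] = sin(pos / 10000^(2i/d_model)),
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model)).
    Assumes d_model is even."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]       # (1, d_model/2)
    angles = pos / np.power(10000.0, (2 * i) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even dims get sine
    pe[:, 1::2] = np.cos(angles)               # odd dims get cosine
    return pe

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for a single head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

In a full transformer, multi-head attention runs several such attention computations in parallel over learned projections of Q, K, and V, and the positional encoding is added to the token embeddings before the first layer.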

Syllabus

Transformers Part 2

Taught by

UofU Data Science

