Overview
Learn the transformer architecture through a comprehensive video series that breaks down the groundbreaking "Attention Is All You Need" paper into digestible segments. Master the fundamental concepts, starting with self-attention mechanisms and progressing through the encoder and decoder components, while understanding how transformers compare to traditional CNNs and RNNs. Explore the complete mathematical foundations behind the self-attention equations and examine the intricacies of masked self-attention and encoder-decoder attention.

Discover how this revolutionary architecture has transformed natural language processing and enabled breakthrough models such as BERT and GPT-3, and how it has found applications beyond NLP in areas such as computer vision and protein modeling. Gain the theoretical foundation needed to understand current transformer-based research and applications across multiple domains, with additional coverage of BERT's transfer-learning approach for NLP tasks.
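The core operation covered throughout the series is scaled dot-product self-attention, including the masked variant used in the decoder. As a rough sketch of the idea (not the course's own code; all variable names and dimensions here are illustrative), it can be written in a few lines of NumPy:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv, causal=False):
    """Scaled dot-product self-attention for one sequence.

    X: (seq_len, d_model) input embeddings
    Wq, Wk, Wv: learned projection matrices, each (d_model, d_k)
    causal: if True, apply the decoder's masked self-attention, so each
            position attends only to itself and earlier positions.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (seq_len, seq_len) similarities
    if causal:
        # Mask out the strict upper triangle: no attention to future tokens.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    weights = softmax(scores)                  # each row sums to 1
    return weights @ V, weights

# Toy example with a 4-token sequence.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv, causal=True)
```

Encoder-decoder attention (Part 8) follows the same formula, except the queries come from the decoder while the keys and values come from the encoder output.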
Syllabus
Transformers - Part 1 - Self-attention: an introduction
Transformers - Part 2 - Self-attention: complete equations
Transformers - Part 3 - Encoder
Transformers - Part 4 - Encoder remarks
Transformers - Part 5 - Transformers vs CNNs and RNNs
Transformers - Part 6 - Decoder (1): testing and training
Transformers - Part 7 - Decoder (2): masked self-attention
Transformers - Part 8 - Decoder (3): encoder-decoder attention
BERT: transfer learning for NLP
Taught by
Lennart Svensson