
YouTube

What is the Transformers' Context Window in Deep Learning and How to Make it Long

Yacine Mahdid via YouTube

Overview

Explore the concept of context windows in the transformer architecture and learn techniques for extending them in this 27-minute video tutorial. Discover why a larger context is beneficial through examples such as retrieval tests, needle-in-a-haystack challenges, and book-translation tasks. Review the fundamentals of attention calculation and position encoding before diving into the challenges of increasing context length. Survey strategies for extending context windows, including improved positional encodings, optimized attention calculation with Flash Attention, sparse attention mechanisms, low-rank decomposition, chunking, and linear-attention approaches. Examine real-world implementations in cutting-edge models such as Llama 4 (with its 10M-token context) and Gemini 2.5 (1M tokens). The tutorial includes references to research papers and additional resources for a deeper understanding of long-context transformer models.
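The quadratic cost that motivates all of these techniques is easy to see in code. The sketch below is not from the video; it is a minimal NumPy implementation of standard scaled dot-product attention, where the pairwise score matrix is n × n, so compute and memory grow quadratically with context length:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    For a sequence of length n, the score matrix Q @ K.T is (n, n),
    so cost grows quadratically with context length -- the core
    obstacle to long context windows.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # (n, d) mixed values

rng = np.random.default_rng(0)
n, d = 8, 4                                         # toy sequence length / head dim
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Doubling n quadruples the size of `scores`; Flash Attention and the sparse/low-rank methods covered in the video all target this n × n matrix.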

Syllabus

- Introduction: 0:00
- Why more context is good: 0:33
- R1 longer context: 1:06
- A little retrieval test: 1:56
- Needle-in-a-haystack: 2:40
- Multi-Round Needle-in-a-haystack: 3:38
- Machine Translation from One Book (MTOB): 4:52
- Attention Calculation Recap: 6:16
- How to encode positions: 8:51
- Issue with increasing context: 10:07
- How to extend context: 11:26
- Fixing positional encoding: 11:45
- Fixing Attention Calculation: 13:21
- Flash Attention: 13:55
- Sparse Attention: 14:52
- Low-Rank Decomposition: 18:14
- Chunking: 19:51
- Other strategies using linear components: 21:44
- Llama 4 changes: 24:12
- Google Long Context Team Nikolay Savinov: 25:33
- See you folks!: 26:50
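As a companion to the sparse-attention portion of the syllabus, here is a minimal sketch (not from the video) of a causal sliding-window mask, one common sparse-attention pattern: each token attends only to itself and the previous `window - 1` positions, cutting cost from O(n²) to O(n · window):

```python
import numpy as np

def sliding_window_mask(n, window):
    """Boolean (n, n) mask: token i may attend to tokens j with
    i - window < j <= i (causal sliding window). Each row has at
    most `window` allowed positions, so masked attention costs
    O(n * window) instead of O(n^2)."""
    i = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    return (j <= i) & (j > i - window)

mask = sliding_window_mask(6, 3)
print(mask.astype(int))
```

In practice the mask is applied by setting disallowed scores to -inf before the softmax, so masked positions receive zero attention weight.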

Taught by

Yacine Mahdid

