Overview

You'll explore the powerful Hugging Face ecosystem and master different pre-trained Transformer architectures, understanding the specific characteristics of BERT, GPT-2, and T5 models along with their tokenizers and use cases.

Syllabus
- Unit 1: Introduction to Hugging Face
  - Your First Sentiment Analysis Pipeline
  - Building Your First Text Generator
  - Building a Question Answering Pipeline
  - Working with Core Transformer Components
- Unit 2: BERT Encoder Architecture
  - Exploring BERT Tokenizer Fundamentals
  - BERT Fills in the Blanks
  - Context Changes Everything in BERT
  - Building Your First BERT Classifier
- Unit 3: GPT-2 Autoregressive Generation
  - GPT-2 Tokenization Pattern Analysis
  - Fixing GPT-2 Text Generation Issues
  - Inside GPT-2 Decision Making
  - Creative versus Predictable Text Generation
- Unit 4: T5 Encoder-Decoder Mastery
  - T5 Tokenizer Bug Hunt
  - Building a Universal Text Processor
  - Perfecting T5 Text Generation
  - T5 Model Architecture Explorer
  - Building Your T5 Task Handler
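To give a flavor of Unit 1, the three pipelines it builds can be sketched in a few lines each. This is a minimal sketch assuming the `transformers` library is installed and default checkpoints can be downloaded; the prompts and texts are illustrative, not from the course:

```python
# Sketch of the Unit 1 pipelines (sentiment analysis, text generation,
# question answering) using the high-level `pipeline` API.
from transformers import pipeline

# Sentiment analysis: returns a POSITIVE/NEGATIVE label with a confidence score.
classifier = pipeline("sentiment-analysis")
print(classifier("I really enjoyed this course!"))

# Text generation: continues a prompt token by token (gpt2 checkpoint assumed).
generator = pipeline("text-generation", model="gpt2")
print(generator("Transformers are", max_new_tokens=20))

# Question answering: extracts an answer span from a context passage.
qa = pipeline("question-answering")
print(qa(question="What does BERT stand for?",
         context="BERT stands for Bidirectional Encoder "
                 "Representations from Transformers."))
```

Each call downloads a default model on first use, so the first run takes noticeably longer than subsequent ones.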
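Unit 2's "BERT Fills in the Blanks" lesson rests on masked language modeling, which can be sketched like this (assuming the public `bert-base-uncased` checkpoint; the sentence is an illustrative example):

```python
# BERT is an encoder: it reads the whole sentence at once and predicts the
# token hidden behind [MASK] from both left and right context.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill_mask("The capital of France is [MASK].")[:3]:
    print(pred["token_str"], round(pred["score"], 3))
```

Changing the surrounding words changes the predictions, which is the point of the "Context Changes Everything in BERT" lesson.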
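The "Creative versus Predictable Text Generation" contrast in Unit 3 comes down to decoding settings. A hedged sketch with the `gpt2` checkpoint (the prompt and temperature value are illustrative):

```python
# GPT-2 is autoregressive: it generates one token at a time, each conditioned
# on everything generated so far. Greedy decoding always picks the most likely
# token; sampling with a higher temperature flattens the distribution.
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
set_seed(42)  # make the sampled run reproducible

inputs = tokenizer("The weather today is", return_tensors="pt")
predictable = model.generate(**inputs, max_new_tokens=15, do_sample=False,
                             pad_token_id=tokenizer.eos_token_id)
creative = model.generate(**inputs, max_new_tokens=15, do_sample=True,
                          temperature=1.2,
                          pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(predictable[0], skip_special_tokens=True))
print(tokenizer.decode(creative[0], skip_special_tokens=True))
```

Setting `pad_token_id` explicitly silences a common GPT-2 warning, since the model has no padding token of its own — one of the issues the "Fixing GPT-2 Text Generation Issues" lesson addresses.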
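Finally, the T5 task handling in Unit 4 builds on the model's text-to-text framing, where a prefix in the input selects the task. A minimal sketch assuming the `t5-small` checkpoint (the translation example follows the standard T5 usage pattern):

```python
# T5 pairs an encoder (reads the input) with a decoder (writes the output),
# and treats every task as text-to-text: translation, summarization, and
# classification all differ only in the task prefix of the input string.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

text = "translate English to German: The house is wonderful."
input_ids = tokenizer(text, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Swapping the prefix (e.g. `"summarize: "`) reuses the same model and tokenizer for a different task, which is the idea behind the "Building Your T5 Task Handler" lesson.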