This course traces the evolution of natural language processing (NLP), from basic concepts to modern transformer architectures. You will learn about attention mechanisms and their impact on language modeling, as well as the details of transformer models, including scaled dot-product attention and multi-head attention. The course includes practical exercises in transfer learning with pre-trained models such as BERT and GPT, with instruction on fine-tuning them for specific NLP tasks in PyTorch. By the end, you will understand the theory behind current NLP models and have practical experience applying them to real-world problems.
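As a taste of the material, here is a minimal sketch of scaled dot-product attention in PyTorch, the mechanism at the heart of the transformer models covered in the course. This is an illustrative implementation, not the course's own code; the function name and toy tensor shapes are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Block masked positions before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return weights @ v, weights

# Toy example: batch of 1, sequence length 3, model dimension 4
q = k = v = torch.randn(1, 3, 4)
out, attn = scaled_dot_product_attention(q, k, v)
```

The scaling by the square root of the key dimension keeps the dot products from growing large and saturating the softmax; multi-head attention simply runs several such attention computations in parallel over learned projections and concatenates the results.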