YouTube videos curated by Class Central.
Code 7 Landmark NLP Papers in PyTorch – Full Neural Machine Translation Course

Classroom Contents
- 1 – 0:01:06 Welcome
- 2 – 0:04:27 Intro to Atlas
- 3 – 0:09:25 Evolution of RNN
- 4 – 0:15:08 Evolution of Machine Translation
- 5 – 0:26:56 Machine Translation Techniques
- 6 – 0:34:28 Long Short-Term Memory Overview
- 7 – 0:52:36 Learning Phrase Representations using RNN Encoder–Decoder for SMT
- 8 – 1:00:46 Learning Phrase Representations PyTorch Lab – Replicating Cho et al., 2014
- 9 – 1:23:45 Seq2Seq Learning with Neural Networks
- 10 – 1:45:06 Seq2Seq PyTorch Lab – Replicating Sutskever et al., 2014
- 11 – 2:01:45 NMT by Jointly Learning to Align & Translate – Bahdanau et al., 2015
- 12 – 2:32:36 NMT by Jointly Learning to Align & Translate PyTorch Lab – Replicating Bahdanau et al., 2015
- 13 – 2:42:45 On Using Very Large Target Vocabulary – Jean et al., 2015
- 14 – 3:03:45 Large Vocabulary NMT PyTorch Lab – Replicating Jean et al., 2015
- 15 – 3:24:56 Effective Approaches to Attention – Luong et al., 2015
- 16 – 3:44:06 Attention Approaches PyTorch Lab – Replicating Luong et al., 2015
- 17 – 4:03:17 Long Short-Term Memory Network Deep Explanation
- 18 – 4:28:13 Attention Is All You Need – Vaswani et al., 2017
- 19 – 4:47:46 Google Neural Machine Translation System (GNMT) – Wu et al., 2016
- 20 – 5:12:38 GNMT PyTorch Lab – Replicating Wu et al., 2016
- 21 – 5:29:46 Google’s Multilingual NMT – Johnson et al., 2017
- 22 – 6:00:46 Multilingual NMT PyTorch Lab – Replicating Johnson et al., 2017
- 23 – 6:15:49 Transformer vs GPT vs BERT Architectures
- 24 – 6:36:38 Transformer Playground Tool Demo
- 25 – 6:38:31 Seq2Seq Idea from Google Translate Tool
- 26 – 6:49:31 RNN, LSTM, and GRU Architecture Comparison
- 27 – 7:01:08 LSTM & GRU Equations
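
For quick reference, the final chapter covers the standard LSTM and GRU update equations, which in their common textbook form are:

```latex
\begin{align}
% LSTM: forget, input, output gates; candidate cell; cell and hidden updates
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t) \\[1ex]
% GRU: update and reset gates; candidate hidden state; hidden update
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) \\
\tilde{h}_t &= \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
\end{align}
```

Here $\sigma$ is the logistic sigmoid and $\odot$ is element-wise multiplication; exact notation in the video may differ.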