
Neural Networks for Natural Language Processing 2020

Graham Neubig via YouTube

Overview

Explore neural network architectures and techniques specifically designed for natural language processing through this comprehensive lecture series from Carnegie Mellon University. Master fundamental concepts starting with language modeling and training efficiency tricks, then progress through convolutional and recurrent neural networks for text processing. Delve into advanced topics including attention mechanisms, distributional semantics, word vectors, and contextual word representations. Learn practical skills for debugging neural networks and implementing structured prediction with various independence assumptions. Discover techniques for generating trees incrementally, search-based structured prediction, and minimum risk training with reinforcement learning. Examine cutting-edge approaches including adversarial methods, models with latent random variables, and unsupervised learning of structure. Advance to specialized applications covering multitask and multilingual learning, document-level modeling, and integration of neural networks with knowledge bases. Conclude with exploration of machine reading comprehension, natural language generation, and model interpretation techniques essential for modern NLP practitioners.
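To give a flavor of one core topic in the series (attention, lecture 7), here is a minimal sketch of scaled dot-product attention in plain Python. The function names and toy vectors are illustrative assumptions for this listing, not material taken from the lectures themselves.

```python
# Illustrative sketch: scaled dot-product attention over toy vectors.
# All names and numbers here are made up for the example.
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Return a weighted average of `values`, weighted by how well
    each key matches the query (dot product, scaled by sqrt(dim))."""
    dim = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(dim)
              for key in keys]
    weights = softmax(scores)
    out = [sum(w * v[i] for w, v in zip(weights, values))
           for i in range(len(values[0]))]
    return out, weights

# A query that matches the first key most strongly, so the output
# leans toward the first value vector.
out, weights = attention(
    query=[1.0, 0.0],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[10.0, 0.0], [0.0, 10.0]],
)
print(weights[0] > weights[1])  # the first key gets more weight
```

The same mechanism, batched over matrices and combined with learned projections, is the building block the lecture develops into full attention-based models.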

Syllabus

CMU Neural Nets for NLP 2020 (1): Introduction
CMU Neural Nets for NLP 2020 (2): Language Modeling, Efficiency/Training Tricks
CMU Neural Nets for NLP 2020 (3): Convolutional Neural Networks for Text
CMU Neural Nets for NLP 2020 (4): Recurrent Neural Networks
CMU Neural Nets for NLP 2020 (5): Efficiency Tricks for Neural Nets
CMU Neural Nets for NLP 2020 (7): Attention
CMU Neural Nets for NLP 2020 (8): Distributional Semantics and Word Vectors
CMU Neural Nets for NLP 2020 (9): Sentence and Contextual Word Representations
CMU Neural Nets for NLP 2020 (10): Debugging Neural Nets (for NLP)
CMU Neural Nets for NLP 2020 (11): Structured Prediction with Local Independence Assumptions
CMU Neural Nets for NLP 2020 (12): Generating Trees Incrementally
CMU Neural Nets for NLP 2020 (13): Generating Trees Incrementally
CMU Neural Nets for NLP 2020 (14): Search-based Structured Prediction
CMU Neural Nets for NLP 2020 (15): Minimum Risk Training and Reinforcement Learning
CMU Neural Nets for NLP 2020 (16): Advanced Search Algorithms
CMU Neural Nets for NLP 2020 (17): Adversarial Methods
CMU Neural Nets for NLP 2020 (18): Models w/ Latent Random Variables
CMU Neural Nets for NLP 2020 (19): Unsupervised and Semi-supervised Learning of Structure
CMU Neural Nets for NLP 2020 (20): Multitask and Multilingual Learning
CMU Neural Nets for NLP 2020 (21): Document Level Models
CMU Neural Nets for NLP 2020 (22): Neural Nets + Knowledge Bases
CMU Neural Nets for NLP 2020 (23): Machine Reading w/ Neural Nets
CMU Neural Nets for NLP 2020 (24): Natural Language Generation
CMU Neural Nets for NLP 2020 (25): Model Interpretation

Taught by

Graham Neubig

Reviews
