
Neural Networks for Natural Language Processing 2017

Graham Neubig via YouTube

Overview

Explore neural network applications in natural language processing through this comprehensive lecture series from Carnegie Mellon University. Begin with foundational concepts, including the class introduction and the rationale behind neural networks, then progress through word prediction exercises and various word modeling techniques. Master efficiency optimizations such as word2vec acceleration tricks before diving into convolutional networks for text processing and recurrent neural networks.

Learn to effectively use and evaluate sentence representations, understand conditioned generation, and implement attention mechanisms for improved model performance. Advance through structured prediction techniques, including handling local dependencies and transition-based dependency parsing, while exploring dynamic programming approaches to parsing. Delve into neural semantic parsing, latent variable models, and reinforcement learning applications in NLP.

Examine adversarial learning techniques and unsupervised structure learning, then explore document-level modeling approaches and dialog system architectures. Investigate knowledge base integration, machine reading comprehension with neural networks, and multilingual and multitask learning strategies. Conclude with practical debugging techniques for neural NLP systems and advanced search algorithms to optimize model performance across a variety of natural language processing tasks.

Syllabus

CMU Neural Nets for NLP 2017 (1): Class Introduction & Why Neural Nets?
CMU Neural Nets for NLP 2017 (2): A Simple (?) Exercise: Predicting the Next Word in a Sentence
CMU Neural Nets for NLP 2017 (3): Models of Words
CMU Neural Nets for NLP 2017 (4): Why is word2vec so fast? Efficiency Tricks.
CMU Neural Nets for NLP 2017 (5): Convolutional Networks for Text
CMU Neural Nets for NLP 2017 (6): Recurrent Neural Networks
CMU Neural Nets for NLP 2017 (7): Using/Evaluating Sentence Representations
CMU Neural Nets for NLP 2017 (8): Conditioned Generation
CMU Neural Nets for NLP 2017 (9): Attention
CMU Neural Nets for NLP 2017 (10): Structured Prediction
CMU Neural Nets for NLP 2017 (11): Structured Prediction w/ Local Dependence
CMU Neural Nets for NLP 2017 (12): Transition-based Dependency Parsing
CMU Neural Nets for NLP 2017 (13): Parsing With Dynamic Programs
CMU Neural Nets for NLP 2017 (14): Neural Semantic Parsing
CMU Neural Nets for NLP 2017 (15): Latent Variable Models
CMU Neural Nets for NLP 2017 (16): Reinforcement Learning
CMU Neural Nets for NLP 2017 (17): Adversarial Learning
CMU Neural Nets for NLP 2017 (18): Unsupervised Learning of Structure
CMU Neural Nets for NLP 2017 (19): Document Level Models
CMU Neural Nets for NLP 2017 (20): Models of Dialog
CMU Neural Nets for NLP 2017 (21): Learning From/For Knowledge Bases
CMU Neural Nets for NLP 2017 (22): Machine Reading w/ Neural Nets
CMU Neural Nets for NLP 2017 (23): Debugging Neural Nets for NLP
CMU Neural Nets for NLP 2017 (24): Advanced Search Algorithms
CMU Neural Nets for NLP 2017 (25): Multilingual and Multitask Learning

Taught by

Graham Neubig
