
Neural Networks for Natural Language Processing 2021

Graham Neubig via YouTube

Overview

Explore neural network architectures and techniques specifically designed for natural language processing through this comprehensive lecture series from Carnegie Mellon University. Begin with foundational concepts including language modeling, efficiency and training tricks, and building neural network toolkits for NLP applications. Master core architectures like recurrent neural networks, attention mechanisms, and sequence-to-sequence models while learning practical implementation skills through the minnn toolkit.

Delve into advanced topics including distributional semantics, word vectors, sentence representations, and contextual embeddings. Study structured prediction methods, model interpretation techniques, and debugging strategies for neural NLP systems. Advance to cutting-edge areas such as tree and graph generation, reinforcement learning for structured prediction, and sequence-to-sequence pre-training approaches.

Examine specialized applications including machine reading comprehension, knowledge base integration, and advanced search algorithms. Explore contemporary challenges in the field through coverage of adversarial methods, latent variable models, multilingual learning approaches, bias detection and mitigation, and document-level modeling techniques. Gain both theoretical understanding and practical experience in implementing state-of-the-art neural network solutions for complex natural language processing tasks.
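To give a flavor of one core topic from the series (attention, lecture 7), here is a minimal sketch of scaled dot-product attention in plain Python. This is an illustrative toy, not code from the course or the minnn toolkit: the function names and the tiny example vectors are invented for demonstration.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(query, keys, values):
    """Weight each value vector by the softmax-normalized dot product
    of the query with its key, scaled by sqrt(dimension)."""
    dim = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(dim)
              for key in keys]
    weights = softmax(scores)
    # Return the weighted sum of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy example: the query aligns with the first key, so the output
# leans toward the first value vector.
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = dot_product_attention([1.0, 0.0], keys, values)
print(out)
```

The same idea, computed in batch over matrices, underlies the attention and sequence-to-sequence lectures in the series.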

Syllabus

CMU Neural Nets for NLP 2021 (1): Introduction
CMU Neural Nets for NLP 2021 (2): Language Modeling, Efficiency/Training Tricks
CMU Neural Nets for NLP 2021 (3): Building A Neural Network Toolkit for NLP, minnn
CMU Neural Nets for NLP 2021 (4): Efficiency Tricks for Neural Nets
CMU Neural Nets for NLP 2021 (5): Recurrent Neural Networks
CMU Neural Nets for NLP 2021 (6): Conditioned Generation
CMU Neural Nets for NLP 2021 (7): Attention
CMU Neural Nets for NLP 2021 (8): Distributional Semantics and Word Vectors
CMU Neural Nets for NLP 2021 (9): Sentence and Contextual Word Representations
CMU Neural Nets for NLP 2021 (10): Debugging Neural Nets (for NLP)
CMU Neural Nets for NLP 2021 (11): Structured Prediction with Local Independence Assumptions
CMU Neural Nets for NLP 2021 (12): Model Interpretation
CMU Neural Nets for NLP 2021 (13): Generating Trees and Graphs
CMU Neural Nets for NLP 2021 (14): Margin-based and Reinforcement Learning for Structured Prediction
CMU Neural Nets for NLP 2021 (15): Sequence-to-sequence Pre-training
CMU Neural Nets for NLP 2021 (16): Machine Reading w/ Neural Nets
CMU Neural Nets for NLP 2021 (17): Neural Nets + Knowledge Bases
CMU Neural Nets for NLP 2021 (18): Advanced Search Algorithms
CMU Neural Nets for NLP 2021 (19): Adversarial Methods
CMU Neural Nets for NLP 2021 (20): Models w/ Latent Random Variables
CMU Neural Nets for NLP 2021 (21): Multilingual Learning
CMU Neural Nets for NLP 2021 (22): Bias in NLP
CMU Neural Nets for NLP 2021 (23): Document-level Models

Taught by

Graham Neubig

