

Deep Learning for Natural Language Processing

University of Colorado Boulder via Coursera

Overview

Deep learning has revolutionized the field of natural language processing and led to many state-of-the-art results. This course introduces students to neural network models and training algorithms frequently used in natural language processing. At the end of this course, learners will be able to explain and implement feedforward networks, recurrent neural networks, and transformers. They will also have an understanding of transfer learning and the inner workings of large language models.

This course can be taken for academic credit as part of CU Boulder’s MS in Data Science or MS in Computer Science degrees offered on the Coursera platform. These fully accredited graduate degrees offer targeted courses, short 8-week sessions, and pay-as-you-go tuition. Admission is based on performance in three preliminary courses, not academic history. CU degrees on Coursera are ideal for recent graduates or working professionals.

Learn more:

  • MS in Data Science: https://www.coursera.org/degrees/master-of-science-data-science-boulder
  • MS in Computer Science: https://coursera.org/degrees/ms-computer-science-boulder

Syllabus

  • Feedforward Neural Nets and Recurrent Neural Networks
    • This first week introduces the fundamental concepts of feedforward and recurrent neural networks (RNNs), focusing on their architectures, mathematical foundations, and applications in natural language processing (NLP). We will begin with an exploration of feedforward networks and their role in sentence embeddings and sentiment analysis. We then progress to RNNs, covering sequence modeling techniques such as LSTMs, GRUs, and bidirectional RNNs, along with their implementation in Python (a minimal sketch follows the syllabus). Finally, you will examine training techniques, gaining hands-on experience in optimizing neural language models.
  • Sequence to Sequence Models, Attention, Transformers
    • This week we'll explore sequence-to-sequence models in natural language processing (NLP), beginning with recurrent neural network (RNN)-based architectures and the introduction of attention mechanisms for improved alignment in tasks like machine translation (an attention sketch follows the syllabus). The module also covers best practices for training neural networks, including regularization, optimization strategies, and efficient model training. At the end of the week, you will gain practical experience in implementing and training sequence-to-sequence models.
  • Transfer Learning
    • This week explores transfer learning techniques in NLP, focusing on pretraining, finetuning, and multilingual models. You will first examine the role of pretrained language models like GPT, GPT-2, and BERT, and their challenges (a minimal finetuning sketch follows the syllabus). We then explore multitask training and data augmentation, highlighting strategies like parameter sharing and loss weighting to improve model generalization across tasks. Finally, you will dive into crosslingual transfer learning, exploring methods like translate-train vs. translate-test, as well as zero-shot, one-shot, and few-shot learning for multilingual NLP.
  • Large Language Models
    • This final week introduces large language models (LLMs) and how they can be effectively used through techniques like prompt engineering, in-context learning, and parameter-efficient finetuning (a few-shot prompting sketch follows the syllabus). You will explore language-and-vision models, understanding how multimodal architectures extend beyond text to integrate visual and other data modalities. We will also examine non-functional properties of LLMs, including challenges such as hallucinations, fairness, resource efficiency, privacy, and interpretability.
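
The sketches below are illustrative only and are not taken from the course materials. First, to ground Week 1's discussion of recurrent architectures, here is a minimal bidirectional LSTM sentiment classifier, assuming PyTorch; the class name, vocabulary size, and dimensions are arbitrary choices.

import torch
import torch.nn as nn

class BiLSTMSentimentClassifier(nn.Module):
    """Minimal bidirectional LSTM for binary sentiment classification (illustrative)."""
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)  # concat of forward + backward states

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)   # hidden: (2, batch, hidden_dim)
        sentence_repr = torch.cat([hidden[0], hidden[1]], dim=-1)
        return self.classifier(sentence_repr)  # (batch, 2) logits

# Toy usage: a batch of two "sentences" of five token ids each.
model = BiLSTMSentimentClassifier()
logits = model(torch.randint(0, 10000, (2, 5)))
print(logits.shape)  # torch.Size([2, 2])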
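
For Week 2, a minimal sketch of the scaled dot-product attention operation that underlies transformer-style sequence-to-sequence models; the tensor shapes in the usage example are illustrative assumptions.

import math
import torch

def scaled_dot_product_attention(query, key, value):
    """Weight each source state by its similarity to the query, then average.

    query: (batch, tgt_len, d); key, value: (batch, src_len, d).
    Returns the attended context vectors and the attention weights.
    """
    d = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d)
    weights = torch.softmax(scores, dim=-1)  # each target position sums to 1 over the source
    return torch.matmul(weights, value), weights

# Toy usage: one decoder position attending over four encoder states.
q = torch.randn(1, 1, 8)
k = v = torch.randn(1, 4, 8)
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # torch.Size([1, 1, 8]) torch.Size([1, 1, 4])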
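
For Week 3, a sketch of finetuning a pretrained BERT for classification, assuming the Hugging Face transformers library; the checkpoint name, toy batch, and learning rate are illustrative assumptions, not the course's materials.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a pretrained BERT and attach a fresh 2-class classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

batch = tokenizer(["a great movie", "a dull movie"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

# One finetuning step: all pretrained weights receive gradient updates.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print(outputs.loss.item())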
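
For Week 4, a sketch of few-shot in-context learning as plain prompt construction: demonstrations are concatenated ahead of the new input so the LLM continues the pattern. The task, labels, and helper name build_few_shot_prompt are hypothetical.

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot classification prompt from demonstration pairs."""
    lines = ["Classify the sentiment of each review as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model is expected to complete the label
    return "\n".join(lines)

demos = [("I loved every minute.", "positive"),
         ("A complete waste of time.", "negative")]
print(build_few_shot_prompt(demos, "Surprisingly good."))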

Taught by

Katharina von der Wense


