

Word Embeddings - From PPMI to Contextual Representations - L3

UofU Data Science via YouTube

Overview

Learn how words are represented as high-dimensional vectors in this comprehensive lecture covering the evolution from PPMI representations through word2vec and GloVe to modern contextual embeddings such as BERT. Explore the fundamental concepts of word embeddings and understand how different ways of defining context shape the resulting word representations in natural language processing. Examine the progression of embedding techniques and discover how contextual information is encoded to produce vector representations that capture semantic relationships and linguistic patterns.
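
To make the starting point of that progression concrete, here is a minimal sketch (not taken from the lecture) of building PPMI word vectors from a toy co-occurrence matrix in Python with NumPy; the vocabulary and counts are invented purely for illustration.

import numpy as np

# Toy word-context co-occurrence counts: rows are target words, columns are context words.
# The vocabulary and the counts below are made up for illustration only.
vocab = ["ice", "steam", "solid", "gas", "water"]
counts = np.array([
    [0, 1, 8, 1, 6],
    [1, 0, 1, 7, 5],
    [8, 1, 0, 0, 2],
    [1, 7, 0, 0, 2],
    [6, 5, 2, 2, 0],
], dtype=float)

total = counts.sum()
p_wc = counts / total                              # joint probability P(w, c)
p_w = counts.sum(axis=1, keepdims=True) / total    # marginal P(w)
p_c = counts.sum(axis=0, keepdims=True) / total    # marginal P(c)

with np.errstate(divide="ignore"):
    pmi = np.log2(p_wc / (p_w * p_c))              # pointwise mutual information
ppmi = np.maximum(pmi, 0.0)                        # clip negative (and -inf) values to zero

# Each row of `ppmi` is a sparse, high-dimensional vector that represents a word by the
# contexts it co-occurs with; word2vec, GloVe, and BERT replace such count-based vectors
# with learned dense (and, for BERT, context-dependent) ones.
print(dict(zip(vocab, np.round(ppmi, 2).tolist())))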

Syllabus

L3 - Word Embeddings

Taught by

UofU Data Science

