Overview
Learn how words are represented as high-dimensional vectors in this lecture, which traces the evolution from count-based PPMI representations through word2vec and GloVe to modern contextual embeddings such as BERT. Explore the fundamental concepts of word embeddings and see how different ways of defining "context" shape the distributional hypotheses behind representations in natural language processing. Examine the progression of embedding techniques and discover how contextual information is encoded into vector representations that capture semantic relationships and linguistic patterns.
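To make the count-based starting point of the lecture concrete, here is a minimal sketch of computing PPMI (Positive Pointwise Mutual Information) vectors from a word-by-word co-occurrence matrix. The four-word vocabulary and the counts are invented for illustration; real matrices are built from corpus windows and are far larger and sparser.

```python
import numpy as np

# Toy co-occurrence counts (hypothetical data; rows and columns
# index the same four-word vocabulary).
vocab = ["cat", "dog", "runs", "sleeps"]
C = np.array([
    [0, 2, 4, 6],
    [2, 0, 5, 3],
    [4, 5, 0, 1],
    [6, 3, 1, 0],
], dtype=float)

total = C.sum()
p_ij = C / total                             # joint probability P(w, c)
p_i = C.sum(axis=1, keepdims=True) / total   # word marginal P(w)
p_j = C.sum(axis=0, keepdims=True) / total   # context marginal P(c)

# PMI = log2( P(w, c) / (P(w) * P(c)) ); zero counts give -inf,
# which the max() below clips away.
with np.errstate(divide="ignore"):
    pmi = np.log2(p_ij / (p_i * p_j))
ppmi = np.maximum(pmi, 0.0)  # keep only positive associations

# Each row of `ppmi` is a count-based embedding for that word.
print(np.round(ppmi, 2))
```

Dense embeddings like word2vec and GloVe can be viewed as learned, low-dimensional compressions of exactly this kind of co-occurrence information.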
Syllabus
L3 - Word Embeddings
Taught by
UofU Data Science