
YouTube

Word Embeddings - Word2Vec Skip-Gram

UofU Data Science via YouTube

Overview

Learn the fundamentals of word embeddings through an in-depth exploration of the word2vec skip-gram model in this comprehensive lecture from UofU Data Science. Discover how words can be represented as dense vectors in high-dimensional space, enabling machines to understand semantic relationships and similarities between words. Explore the skip-gram architecture, which predicts surrounding context words given a target word, and understand how this approach captures meaningful linguistic patterns.

Examine the mathematical foundations behind the model, including the objective function, negative sampling techniques, and hierarchical softmax optimization methods. Gain practical insights into training procedures, hyperparameter tuning, and implementation considerations for building effective word embeddings. Understand how skip-gram differs from other word2vec variants like CBOW (Continuous Bag of Words) and learn about the trade-offs between different approaches.

Analyze real-world applications where word embeddings enhance natural language processing tasks such as sentiment analysis, machine translation, and information retrieval. Access accompanying slides to reinforce key concepts and visualize the embedding space through practical examples and case studies.
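For readers who want to see the objective the overview alludes to, the standard skip-gram negative-sampling objective (Mikolov et al., 2013) maximizes, for each target word t and observed context word c,

\log \sigma(u_c^\top v_t) + \sum_{i=1}^{k} \mathbb{E}_{w_i \sim P_n(w)}\left[\log \sigma(-u_{w_i}^\top v_t)\right]

where v_t is the target ("input") vector, u_c the context ("output") vector, \sigma the logistic function, and the k negatives w_i are drawn from a noise distribution P_n (in practice, the unigram distribution raised to the 3/4 power). Below is a minimal, self-contained NumPy sketch of that training loop; the toy corpus, the hyperparameters (embedding size, window, learning rate, number of negatives), and the uniform negative sampler are illustrative assumptions, not the lecture's implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (illustrative assumption).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = [word2id[w] for w in corpus]

V, D = len(vocab), 16               # vocabulary size, embedding dimension
W_in = rng.normal(0, 0.1, (V, D))   # target ("input") vectors v_t
W_out = rng.normal(0, 0.1, (V, D))  # context ("output") vectors u_c

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

window, k, lr = 2, 5, 0.05          # context window, negatives per pair, learning rate
for _ in range(200):                # epochs
    for pos, target in enumerate(ids):
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            # One positive (target, context) pair plus k uniform negatives.
            # Real word2vec draws negatives from unigram^0.75 instead.
            samples = [(ids[ctx_pos], 1.0)]
            samples += [(int(rng.integers(V)), 0.0) for _ in range(k)]
            grad_in = np.zeros(D)
            for c, label in samples:
                score = sigmoid(W_in[target] @ W_out[c])
                g = score - label   # gradient of the logistic loss w.r.t. the raw score
                grad_in += g * W_out[c]
                W_out[c] -= lr * g * W_in[target]
            W_in[target] -= lr * grad_in

# Cosine similarity between two learned target vectors.
def cosine(a, b):
    va, vb = W_in[word2id[a]], W_in[word2id[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(cosine("quick", "brown"))

After training, cosine similarity in the embedding space is the usual way to check that words appearing in similar contexts end up with nearby vectors, which is the property the applications mentioned above (sentiment analysis, machine translation, information retrieval) rely on.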

Syllabus

Word embeddings: word2vec skip-gram

Taught by

UofU Data Science

