YouTube

NLP for Semantic Search Course

James Briggs via YouTube

Overview

Learn natural language processing techniques for building semantic search systems and working with vector databases in this comprehensive 9-hour course. Master dense vector representations for NLP and computer vision applications, then dive deep into sentence embeddings using transformer models. Explore multiple fine-tuning approaches, including the original NLI softmax loss method and the high-performance Multiple Negatives Ranking technique.

Discover how to work with multilingual sentence vectors that support over 50 languages using a single model, and understand unsupervised sentence transformer training through TSDAE (Transformer-based Sequential Denoising Auto-Encoder). Examine evaluation measures for search and recommender systems, then advance to data augmentation and domain transfer with Augmented SBERT (AugSBERT).

Build practical question-answering systems by implementing both extractive and abstractive QA approaches, including complete Python implementations of open-domain question answering and reader models. Finally, learn advanced training techniques such as query generation (GenQ) for sentence transformers, and explore cutting-edge methods like Generative Pseudo-Labeling (GPL), which may represent the future of sentence transformer training.
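The dense-vector retrieval idea at the heart of the course can be sketched in a few lines: documents and queries are embedded as vectors, and search becomes ranking by vector similarity. The snippet below is a minimal illustration with invented 3-dimensional toy vectors; in practice a sentence-transformer model (as taught in the course) would produce embeddings with hundreds of dimensions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, doc_vecs, top_k=2):
    """Rank document IDs by cosine similarity to the query vector."""
    ranked = sorted(doc_vecs.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

# Toy 3-d "embeddings" -- invented for illustration only.
docs = {
    "pizza recipe": [0.9, 0.1, 0.0],
    "pasta recipe": [0.8, 0.2, 0.1],
    "car repair":   [0.0, 0.1, 0.9],
}
query = [0.85, 0.15, 0.05]  # stands in for an embedded query
print(search(query, docs))  # the two cooking documents rank first
```

A vector database performs the same similarity ranking, but with approximate nearest-neighbor indexes so it scales to millions of embeddings.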

Syllabus

Intro to Dense Vectors for NLP and Vision
Intro to Sentence Embeddings with Transformers
Fine-tune Sentence Transformers the OG Way (with NLI Softmax loss)
Fine-tune High Performance Sentence Transformers (with Multiple Negatives Ranking)
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
Today Unsupervised Sentence Transformers, Tomorrow Skynet (how TSDAE works)
Evaluation Measures for Search and Recommender Systems
Making The Most of Data: Augmented SBERT
AugSBERT: Domain Transfer for Sentence Transformers
Question-Answering in NLP (Extractive QA and Abstractive QA)
How to build a Q&A AI in Python (Open-domain Question-Answering)
How to build a Q&A Reader Model in Python (Open-domain QA)
Train Sentence Transformers by Generating Queries (GenQ)
Is GPL the Future of Sentence Transformers? | Generative Pseudo-Labeling Deep Dive
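The Multiple Negatives Ranking loss covered in the fine-tuning lecture has a simple core: for a batch of (anchor, positive) pairs, every other pair's positive acts as an in-batch negative, and the model is trained with cross-entropy so each anchor scores its own positive highest. This is a stdlib-only sketch of that computation on raw vectors; a real training loop would backpropagate through the model producing the embeddings.

```python
import math

def mnr_loss(anchors, positives):
    """Multiple Negatives Ranking loss over a batch of (anchor, positive)
    vector pairs. For anchor i, positives[j] with j != i serve as negatives."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) *
                      math.sqrt(sum(x * x for x in b)))

    total = 0.0
    for i, anchor in enumerate(anchors):
        scores = [cosine(anchor, p) for p in positives]   # row of similarities
        log_denom = math.log(sum(math.exp(s) for s in scores))
        total += -(scores[i] - log_denom)                  # cross-entropy, label = i
    return total / len(anchors)
```

With matched pairs the loss is small; shuffling the positives (so each anchor's "correct" column is wrong) drives it up, which is exactly the signal that pulls true pairs together during fine-tuning.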

Taught by

James Briggs

