Class Central is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

Fine-Tuning Text Embeddings for Domain-specific Search with Python

Shaw Talebi via YouTube

Overview

Learn how to fine-tune text embedding models for domain-specific search applications in this technical Python tutorial. Walk through a complete process of adapting embedding models using the Sentence Transformers library, with a practical focus on AI job search use cases. Master the five key steps of embedding fine-tuning: gathering positive and negative training pairs, selecting appropriate pre-trained models, choosing optimal loss functions, executing the fine-tuning process, and evaluating model performance. Explore concepts like Retrieval Augmented Generation (RAG), understand common vector search challenges, and gain hands-on experience with real-world datasets and model implementations. Access comprehensive resources including GitHub code, Hugging Face models and datasets, and detailed documentation references to support the learning process.

Syllabus

Intro
RAG
Problem with Vector Search
Fine-tuning
Why fine-tune?
5 Steps for Fine-tuning Embeddings
Example: Fine-tuning Embeddings on AI Jobs
Step 1: Gather Positive and Negative Pairs
Step 2: Pick a Pre-trained Model
Step 3: Pick a Loss Function
Step 4: Fine-tune the Model
Step 5: Evaluate the Model
What's Next?

Taught by

Shaw Talebi

