Overview
Explore a technical video that examines RAGAS, a framework for evaluating Retrieval-Augmented Generation (RAG) systems. Learn why RAG evaluation is hard and how RAGAS addresses it through three criteria: faithfulness, answer relevance, and context relevance. Review the fundamentals of RAG systems, walk through practical examples using the arXiv Dive Bot, and examine the WikiEval dataset used to benchmark each criterion. Gain hands-on experience with Oxen AI's dataset versioning capabilities, connect with the community through Discord, and access supplementary materials including the original RAGAS paper and the datasets used in the examples.
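To make the criteria concrete, here is a minimal, self-contained sketch of a faithfulness-style score. It is a toy illustration, not the RAGAS implementation: RAGAS decomposes the answer into statements with an LLM and asks an LLM judge whether each statement is supported by the retrieved context, whereas this sketch assumes the statements are already split out and approximates "supported" with word overlap against the context.

```python
def faithfulness_score(answer_statements, context, threshold=0.5):
    """Toy faithfulness: fraction of answer statements whose words
    mostly appear in the retrieved context.

    In RAGAS proper, statement extraction and support-checking are
    both done by an LLM; the word-overlap test here is a stand-in.
    """
    context_words = set(context.lower().split())
    supported = 0
    for stmt in answer_statements:
        words = set(stmt.lower().split())
        if not words:
            continue
        overlap = len(words & context_words) / len(words)
        if overlap >= threshold:
            supported += 1
    return supported / len(answer_statements)


# Example: one statement grounded in the context, one hallucinated.
context = "oxen tracks datasets and models"
statements = ["oxen tracks datasets", "the moon is cheese"]
print(faithfulness_score(statements, context))  # 0.5
```

Answer relevance and context relevance follow the same shape: score the answer against the original question, and score each retrieved chunk against the question, rather than scoring the answer against the context.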
Syllabus
Intro to the arXiv Dive
What is RAGAS?
Quick Recap on What RAG is
Intro to the arXiv Dive Bot
Why Evaluating RAG is Hard
Intro to the Example Dataset
Enter RAGAS
The First Criterion: Faithfulness
The Second Criterion: Answer Relevance
The WikiEval Dataset
The Third Criterion: Context Relevance
Conclusion
Taught by
Oxen