Overview
Explore a critical vulnerability in Retrieval-Augmented Generation (RAG) systems in this 14-minute video, which shows how LLMs struggle with multi-step reasoning when retrieved knowledge conflicts. Learn about research from EPFL and Stony Brook University demonstrating that RAG systems can collapse when processing contradictory information, producing flawed reasoning and unreliable outputs, and examine the limits of knowledge propagation in large language models. Understand why this often-overlooked failure mode is a significant concern for developers building retrieval-augmented generation systems.
Syllabus
RAG Collapses: Reasoning w/ Conflicting Knowledge
Taught by
Discover AI