
Generating Zero-Shot Hard-Case Hallucinations - A Synthetic and Open Data Approach

Databricks via YouTube

Overview

Learn about a novel framework for designing and inducing controlled hallucinations in long-form content generated by large language models across diverse domains in this 21-minute conference talk. Discover a systematic approach for creating fully synthetic benchmarks and mining hard cases to iteratively refine zero-shot hallucination detectors. Explore how Gretel Data Designer (now part of NVIDIA) is used to design realistic, high-quality long-context datasets across a range of domains, and understand the reasoning-based hard-case mining methodology, which relies on chain-of-thought generation of both faithful and deceptive question-answer pairs. Examine how consensus labeling and detector frameworks filter synthetic examples to identify zero-shot hard cases, yielding a fully automated system, released under open data licenses such as Apache-2.0, that generates hallucinations at the edge of a target LLM's detection capabilities. Gain insights from Eric Tramel, Principal Research Scientist at NVIDIA, on this cutting-edge approach to improving AI system reliability and hallucination detection.
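The consensus-labeling and hard-case filtering step described above can be sketched roughly as follows. This is a minimal illustration under assumed data shapes, not the talk's actual pipeline: the field names (`judge_labels`, `detector_label`) and the unanimous-agreement threshold are hypothetical stand-ins for the LLM judges and zero-shot detector discussed in the talk.

```python
# Illustrative sketch: keep synthetic examples where independent judges agree
# on the ground-truth label but the zero-shot detector gets it wrong --
# these are the "edge-of-capability" hard cases. All names are hypothetical.

from collections import Counter
from typing import Dict, List, Optional

def consensus_label(labels: List[str], min_agreement: float = 1.0) -> Optional[str]:
    """Return the majority label if agreement meets the threshold, else None."""
    if not labels:
        return None
    label, count = Counter(labels).most_common(1)[0]
    return label if count / len(labels) >= min_agreement else None

def mine_hard_cases(examples: List[Dict]) -> List[Dict]:
    """Filter to examples with a consensus ground truth that the detector misses."""
    hard = []
    for ex in examples:
        truth = consensus_label(ex["judge_labels"])
        if truth is not None and ex["detector_label"] != truth:
            hard.append({**ex, "consensus": truth})
    return hard

# Toy run: example 1 is a hard case (unanimous judges, detector disagrees).
examples = [
    {"id": 1, "judge_labels": ["hallucinated"] * 3, "detector_label": "faithful"},
    {"id": 2, "judge_labels": ["faithful"] * 3, "detector_label": "faithful"},
]
print([ex["id"] for ex in mine_hard_cases(examples)])  # → [1]
```

The mined hard cases would then feed the iterative refinement loop the talk describes: retrain or re-prompt the detector on them, regenerate, and filter again.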

Syllabus

Generating Zero-Shot Hard-Case Hallucinations: A Synthetic and Open Data Approach

Taught by

Databricks

