This Lying Has To Stop - Keeping AI Honest with OpenTelemetry

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Learn how to detect and prevent AI hallucinations in automated systems using OpenTelemetry observability tools in this 25-minute conference talk from CNCF. Explore the real-world challenges of building an AI-augmented engineering journal called Commit Story, where AI assistants confidently provide false information that can derail debugging efforts. Discover how OpenTelemetry traces can reveal the truth behind AI-generated responses by providing visibility into commit-triggered AI calls and response patterns. Follow along with live demonstrations of authentic debugging disasters to understand how AI deception occurs and learn practical techniques for verifying AI response trustworthiness. Gain insights into implementing effective guardrails that help prevent AI hallucinations before they impact your development workflow, ensuring your automated documentation systems remain reliable and accurate.
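The talk's own implementation isn't reproduced here, but the core idea described above — recording each commit-triggered AI call as a trace span so its inputs and outputs can be audited, then checking the response against ground truth — can be sketched in a few lines. This is a conceptual illustration only: the span recorder is hand-rolled rather than the real OpenTelemetry SDK, and every name and attribute key (`Tracer`, `check_ai_response`, `ai.response.cited_files`, and so on) is hypothetical, not the speaker's code or OpenTelemetry's actual API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Span:
    """Minimal stand-in for an OpenTelemetry span."""
    name: str
    attributes: dict = field(default_factory=dict)
    start: float = 0.0
    end: float = 0.0

class Tracer:
    """Records spans for later inspection (hypothetical, not the OTel SDK)."""
    def __init__(self):
        self.spans = []

    def record(self, name, attributes):
        span = Span(name, attributes, start=time.time())
        span.end = time.time()
        self.spans.append(span)
        return span

def check_ai_response(span, files_in_commit):
    """One simple guardrail: flag any file the AI response cites
    that was never actually touched by the commit."""
    cited = span.attributes.get("ai.response.cited_files", [])
    return [f for f in cited if f not in files_in_commit]

# Record a commit-triggered AI call as a span with auditable attributes.
tracer = Tracer()
span = tracer.record("commit_story.ai_call", {
    "ai.model": "example-model",            # illustrative attribute names
    "ai.prompt.commit": "abc123",
    "ai.response.cited_files": ["src/app.py", "src/ghost.py"],
})

# The trace reveals the hallucination: src/ghost.py is not in the commit.
suspect = check_ai_response(span, {"src/app.py"})
# suspect == ["src/ghost.py"]
```

The point of the sketch is the workflow, not the classes: once every AI call leaves a span with its prompt context and claimed facts as attributes, a verification step can compare those claims against the real commit data instead of trusting the model's confident answer.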

Syllabus

This Lying Has To Stop: Keeping AI Honest with OpenTelemetry - Whitney Lee, Datadog

Taught by

CNCF [Cloud Native Computing Foundation]
