
LLM Observability Panel - Monitoring and Evaluation in Production AI Systems

MLOps World: Machine Learning in Production via YouTube

Overview

Join a panel discussion featuring industry experts from Galileo, DraftKings, Target Corporation, and PIMCO as they explore the landscape of LLM observability in production environments. The panelists are Atin Sanyal (Founder & CTO at Galileo), Naresh Kumar Batthula (Principal Developer Advocate at DraftKings), Bali Varadarajan (Lead AI Engineer – Digital Personalization at Target Corporation), and Anupama Garani (AI & ML Systems at PIMCO), who share practical insights on monitoring, evaluating, and maintaining reliability in large language model systems.

Discover key considerations for implementing effective observability frameworks in LLM deployments, gain lessons from enterprise AI teams across diverse industries, and understand how monitoring and evaluation approaches are evolving to meet the unique challenges of generative AI. The discussion covers real-world strategies for ensuring production AI system reliability, emerging best practices in model monitoring, and the infrastructure requirements for running robust LLM operations at scale.

Syllabus

Observability Panel | Galileo, DraftKings, Target Corporation, PIMCO | MLOps World 2025

Taught by

MLOps World: Machine Learning in Production
