Overview
Learn to implement comprehensive observability strategies for serverless generative AI applications built on Amazon Bedrock in this conference talk from Conf42 Observability 2025. Explore the fundamental challenges of monitoring probabilistic AI systems compared to traditional deterministic applications, and understand why conventional monitoring approaches fall short for generative AI models.

Discover the essential building blocks of robust generative AI applications, including architecture patterns and design considerations specific to serverless environments. Master the key objectives of AI application monitoring, focusing on performance metrics, cost optimization, and the reliability measures that matter most in production. Examine practical techniques for integrating observability tools and frameworks into your Amazon Bedrock applications, with detailed coverage of logging, tracing, and monitoring strategies.

Follow along with a hands-on demonstration showing real-world implementation of observability patterns, including code examples and configuration details for monitoring AI model performance, token usage, and response quality. Gain insights into measuring and optimizing the characteristics unique to generative AI workloads, such as latency variability, output quality assessment, and resource consumption patterns that differ significantly from traditional web applications.
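The latency and token-usage tracking described above can be sketched as a thin wrapper around a model invocation. This is a minimal illustration, not the speaker's implementation: `invoke_with_metrics` and `fake_invoke` are hypothetical names, and the `usage` keys (`inputTokens`/`outputTokens`) assume the response shape of Bedrock's Converse API; other invocation APIs report usage differently.

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("bedrock-observability")


def invoke_with_metrics(invoke_fn, payload):
    """Wrap a model invocation and record latency plus token usage.

    invoke_fn is any callable returning a response dict with a 'usage'
    entry containing 'inputTokens'/'outputTokens' (the shape assumed here
    for Bedrock's Converse API; adjust the keys for other APIs).
    """
    start = time.perf_counter()
    response = invoke_fn(payload)
    latency_ms = (time.perf_counter() - start) * 1000

    usage = response.get("usage", {})
    metrics = {
        "latency_ms": round(latency_ms, 1),
        "input_tokens": usage.get("inputTokens", 0),
        "output_tokens": usage.get("outputTokens", 0),
    }
    # In production these would be emitted to your metrics backend
    # (e.g. CloudWatch); here we just log them.
    log.info("invocation metrics: %s", metrics)
    return response, metrics


# Stub standing in for a real Bedrock client call during local testing.
def fake_invoke(payload):
    return {
        "output": {"message": "ok"},
        "usage": {"inputTokens": 12, "outputTokens": 34},
    }


response, metrics = invoke_with_metrics(fake_invoke, {"prompt": "hello"})
```

In a real deployment the same wrapper would sit around the actual Bedrock client call, so every invocation emits the latency, token-count, and cost signals the talk highlights as the ones that matter most in production.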
Syllabus
00:00 Introduction and Speaker Introduction
01:06 Challenges with Generative AI Models
02:43 Deterministic vs. Probabilistic Systems
07:39 Key Objectives for AI Applications
10:10 Building Blocks of Generative AI Applications
19:39 Implementing Observability in AI Applications
22:40 Demo and Practical Implementation
27:45 Conclusion and Closing Remarks
Taught by
Conf42