Overview
Learn to architect and build real-time data systems that power conversational AI applications with human-speed responsiveness in this 25-minute conference talk. Discover how to design "conversation-fast" backends that combine streaming ingestion, real-time analytics, and orchestration for production-grade AI chat systems, demonstrated live alongside an open-source reference architecture.

Explore how multiple technologies fit together: Postgres for transactional data, Elasticsearch for retrieval, ClickHouse for analytics, Kafka-compatible platforms for streaming, and Temporal for orchestration, all connected through a React frontend with MCP interfaces that link LLMs directly to applications. Examine how MooseStack, an open-source toolkit, simplifies adding streaming and OLAP capabilities to TypeScript or Python applications, and see a cloud-deployed reference implementation built in collaboration between StreamNative and FiveOneFour's Boreal for production-ready scalability.

Master low-latency system architecture techniques, real-time data patterns that integrate chat with analytics, practical cloud-native deployment strategies, and accelerated AI application development with MooseStack, with concrete open-source patterns designed to make AI conversations feel natural at real-time speed.
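To make "conversation-fast" concrete, here is a minimal sketch of a latency-budget check for a backend like the one described, summing per-stage latencies (ingestion, retrieval, analytics, orchestration, model time-to-first-token) against a conversational target. All stage names, numbers, and the budget value are illustrative assumptions, not figures from the talk.

```python
# Illustrative latency-budget check for a "conversation-fast" backend.
# Stage names and millisecond values are assumptions for illustration only.

CONVERSATIONAL_BUDGET_MS = 500  # rough target for the first visible response


def within_budget(stage_latencies_ms: dict[str, float],
                  budget_ms: float = CONVERSATIONAL_BUDGET_MS) -> tuple[bool, float]:
    """Sum per-stage latencies and compare the total against the budget."""
    total = sum(stage_latencies_ms.values())
    return total <= budget_ms, total


stages = {
    "kafka_ingest": 15,         # streaming ingestion hop
    "elasticsearch_query": 40,  # retrieval lookup
    "clickhouse_rollup": 60,    # real-time analytics query
    "temporal_overhead": 25,    # workflow orchestration overhead
    "llm_first_token": 300,     # model time-to-first-token
}

ok, total = within_budget(stages)
print(f"total={total}ms within_budget={ok}")
```

A budget like this is useful during design reviews: it makes explicit which stage (usually the model's time-to-first-token) dominates, and how much headroom remains for the data layers.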
Syllabus
[AI + Stream Processing] Building Real-Time Data Architectures for AI Chat
Taught by
StreamNative