Highly Scalable AI Search Engine and AI Data Lake With Kubernetes and LanceDB
CNCF [Cloud Native Computing Foundation] via YouTube
Overview
Learn how to build highly scalable AI search engines and data lakes using Kubernetes and LanceDB in this conference talk, which addresses the retrieval challenges facing modern AI applications. AI workflows require not only vector database capabilities but also feature-store retrieval and analytical queries, which traditionally forces organizations to store AI data in separate silos across multiple systems, increasing both cost and complexity. Explore LanceDB's unified approach, which combines vector search, feature retrieval, and SQL-based analytics in a single system built on the open-source Lance columnar format, an emerging standard for AI data storage. Understand how Kubernetes-native autoscaling lets RAG and AI-agent applications embedding LanceDB scale dynamically, yielding an architecture that redefines the performance-scale-cost curve and delivers hyper-scalable AI applications with 10x better cost efficiency. Examine how integrating KServe for model serving, Postgres for metadata caching, and Kubernetes-native data caching breaks the traditional "impossible triangle" of performance, scale, and cost in AI infrastructure.
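To make the "unified retrieval" idea concrete, here is a minimal, self-contained Python sketch of a single table that serves both a vector-similarity query and an analytical (filtered) query in one call. This is not the LanceDB API; the class and method names (`UnifiedTable`, `search`, `where`) are hypothetical, stdlib-only stand-ins for the pattern the talk describes.

```python
import math

class UnifiedTable:
    """Toy stand-in for a unified AI data store: rows carry both an
    embedding vector and ordinary feature columns, so one query can
    combine an analytical predicate with similarity ranking."""

    def __init__(self, rows):
        # Each row: {"id": ..., "vector": [...], plus arbitrary feature columns}
        self.rows = rows

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def search(self, query_vec, where=None, limit=5):
        # Filter first (the "SQL/feature-store" side), then rank the
        # survivors by vector similarity (the "vector database" side).
        candidates = [r for r in self.rows if where is None or where(r)]
        candidates.sort(key=lambda r: self._cosine(query_vec, r["vector"]),
                        reverse=True)
        return candidates[:limit]

table = UnifiedTable([
    {"id": 1, "vector": [1.0, 0.0], "category": "doc"},
    {"id": 2, "vector": [0.9, 0.1], "category": "img"},
    {"id": 3, "vector": [0.0, 1.0], "category": "doc"},
])
# One call mixes a predicate with nearest-neighbor ranking.
hits = table.search([1.0, 0.0], where=lambda r: r["category"] == "doc", limit=2)
print([r["id"] for r in hits])  # → [1, 3]
```

In a production system the two sides of this query would hit the same Lance columnar files rather than a Python list, which is what removes the need for separate vector-database and analytics silos.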
Syllabus
Highly Scalable AI Search Engine and AI Data Lake With Kubernetes and LanceDB — Lu Qiu & Chanchan Mao
Taught by
CNCF [Cloud Native Computing Foundation]