Overview
Explore a 44-minute conference talk from SNIA SDC 2024 examining how flash storage technology is transforming data ingestion for AI workloads. Dive into the challenges that traditional object stores such as Amazon S3, Google Cloud Storage, and Azure Blob Storage face as AI deployments scale to production levels, using Meta's Tectonic-Shift platform as a case study. Learn about the growing demands of Deep Learning Recommendation Model (DLRM) training and how flash storage addresses its bandwidth and power requirements. Examine key findings from MLPerf DLRM preprocessing and training storage trace analysis, and understand the critical need for standardized benchmarks that measure data ingestion performance and power efficiency. Gain insights from Micron Technology experts on the evolving landscape of AI deployment, object store requirements, and the strategic role of flash storage in meeting these emerging challenges.
Syllabus
SNIA SDC 2024 - The Role of Flash in Data Ingestion within the AI Pipeline
Taught by
SNIA Video