Simplifying Data Pipelines With Lakeflow Declarative Pipelines - A Beginner's Guide
Databricks via YouTube
Overview
Learn how to build and manage reliable data pipelines using Databricks' Lakeflow Declarative Pipelines (DLT) in this 39-minute conference talk from the Data + AI Summit. Discover how DLT simplifies data engineering by unifying batch and streaming processing, reducing operational complexity, and ensuring dependable data delivery at scale. Explore the brand-new pipeline editor that streamlines data transformations, and learn how serverless compute modes can be tuned for either performance or cost efficiency.

Master declarative change data capture techniques and build efficient SQL-based pipelines for both batch ETL and real-time processing workloads. Gain insights into full Unity Catalog integration for governance and lineage tracking, and learn how to read and write data with Kafka and custom sources. Examine the monitoring and observability features essential for operational excellence, and discover the new "Real-time Mode" designed for ultra-low-latency streaming applications. The session demonstrates how DLT powers better analytics and AI initiatives through a reliable, unified pipeline architecture, making it an essential resource for data engineers looking to modernize their data processing workflows.
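To make the declarative style concrete before watching, here is a minimal sketch using the DLT Python API, covering two of the talk's topics: a streaming ingestion table with a data quality expectation, and declarative change data capture via apply_changes. The table names, storage path, and CDC feed below are hypothetical placeholders, not examples taken from the talk.

# Minimal DLT pipeline sketch; all names and paths are hypothetical.
# The `spark` session is provided automatically by the pipeline runtime.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Orders ingested incrementally with Auto Loader.")
def raw_orders():
    return (
        spark.readStream.format("cloudFiles")       # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/Volumes/examples/landing/orders/")  # hypothetical path
    )

@dlt.table(comment="Validated orders; DLT infers the dependency graph.")
@dlt.expect_or_drop("valid_amount", "amount > 0")   # declarative quality rule
def clean_orders():
    return dlt.read_stream("raw_orders").where(col("order_id").isNotNull())

# Declarative change data capture: upsert a CDC feed into a target table
# instead of hand-writing MERGE logic.
dlt.create_streaming_table("customers")
dlt.apply_changes(
    target="customers",
    source="customers_cdc_feed",   # hypothetical upstream CDC stream
    keys=["customer_id"],          # primary key for matching rows
    sequence_by=col("event_ts"),   # orders out-of-order change events
    stored_as_scd_type=1,          # keep only the latest row per key
)

Note that the code only declares what each dataset should contain; DLT resolves the dependency graph, manages checkpoints, and decides incremental versus full recomputation, which is the core idea the talk expands on.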
Syllabus
Simplifying Data Pipelines With Lakeflow Declarative Pipelines: A Beginner’s Guide
Taught by
Databricks