Lakeflow Declarative Pipelines Integrations and Interoperability - Get Data From and to Anywhere
Databricks via YouTube
Overview
Learn how to integrate Delta Live Tables (DLT) with external systems to ingest and deliver data across diverse platforms in this 39-minute conference talk. Discover new DLT capabilities, including the DLT Sinks API and enhanced support for the Python Data Source API and foreachBatch, that enable integration with virtually any system. Explore practical implementations of popular Apache Spark integrations such as JDBC, Kafka, external and managed Delta tables, Azure Cosmos DB, and MongoDB. Understand how DLT extends beyond traditional ingestion and ETL into the lakehouse to support end-to-end pipeline interoperability. Gain insight into production-ready approaches for connecting data engineering workflows with external systems and databases, and learn techniques for building declarative pipelines that move data seamlessly between platforms and services in your data architecture.
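As an illustration of the sink pattern the talk covers, a DLT pipeline can declare an external Kafka sink and route a flow into it. The following is a minimal sketch, assuming the `dlt.create_sink` and `@dlt.append_flow` APIs from Databricks' Python interface; the broker address, topic, and the upstream `orders` table are placeholder assumptions, and the code runs only as part of a DLT pipeline on Databricks, not standalone.

```python
import dlt
from pyspark.sql.functions import to_json, struct

# Declare an external Kafka sink (server, topic, and names are placeholders).
dlt.create_sink(
    name="orders_kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",
        "topic": "orders_out",
    },
)

# Append-only flow: serialize each row to JSON and write it to the sink.
@dlt.append_flow(name="orders_to_kafka", target="orders_kafka_sink")
def orders_to_kafka():
    return (
        dlt.read_stream("orders")  # assumed upstream streaming table
        .select(to_json(struct("*")).alias("value"))
    )
```

The same `create_sink` pattern applies to other targets the talk mentions (e.g. external Delta tables), which is what lets a declarative pipeline send data out of, not just into, the lakehouse.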
Syllabus
Lakeflow Declarative Pipelines Integrations and Interoperability: Get Data From — and to — Anywhere
Taught by
Databricks