Lakeflow Declarative Pipelines Integrations and Interoperability - Get Data From and to Anywhere
Databricks via YouTube
Overview
Learn how to integrate Delta Live Tables (DLT) with external systems to ingest and send data across diverse platforms in this 39-minute conference talk. Discover new DLT capabilities, including the DLT Sinks API and enhanced support for the Python Data Source API and foreachBatch, that enable integration with virtually any system. Explore practical implementations of popular Apache Spark integrations such as JDBC, Kafka, external and managed Delta tables, Azure Cosmos DB, and MongoDB. Understand how DLT extends beyond traditional ingestion and ETL into the lakehouse to support end-to-end data pipeline interoperability. Gain insights into production-ready approaches for connecting data engineering workflows with external systems and databases, and learn techniques for building declarative pipelines that move data seamlessly between platforms and services within your data architecture.
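As a rough illustration of the DLT Sinks API the talk covers, the sketch below defines a Kafka sink and attaches an append flow that streams rows from a pipeline table into it. This is a non-runnable sketch that only executes inside a Databricks Lakeflow/DLT pipeline; the broker address, topic, and table name (`events`) are hypothetical placeholders, not values from the talk.

```python
import dlt  # available only inside a Databricks DLT pipeline

# Declare an external sink (hypothetical Kafka broker and topic).
dlt.create_sink(
    name="kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker-1:9092",  # placeholder
        "topic": "events_out",                       # placeholder
    },
)

# Append flow: continuously write rows from a pipeline table to the sink.
@dlt.append_flow(name="events_to_kafka", target="kafka_sink")
def events_to_kafka():
    # 'events' is an assumed upstream table defined elsewhere in the pipeline.
    return spark.readStream.table("events")
```

The same pattern extends to other targets the talk mentions (e.g. Delta tables as sinks), while systems without a native sink format can be reached from a `foreachBatch` write inside the flow.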
Syllabus
Lakeflow Declarative Pipelines Integrations and Interoperability: Get Data From — and to — Anywhere
Taught by
Databricks