Class Central is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

Microsoft

Building Data Lakes and Lakehouses with Microsoft Fabric

Microsoft via Coursera

Overview

In this course, you will focus on the practical implementation of a Lakehouse architecture using Microsoft Fabric. You will design and build a Lakehouse with Delta tables, implement medallion architecture patterns, and develop data ingestion strategies. The course covers ACID principles, query endpoints, and security controls, providing you with hands-on experience in creating a complete Lakehouse environment with appropriate security. By the end of the course, you will be able to implement a functional and secure enterprise Lakehouse solution, with a focus on data governance and mastery of the Microsoft Fabric platform.

Syllabus

  • Fabric Lakehouse fundamentals
    • Establish a solid foundation in the Microsoft Fabric Lakehouse—a powerful solution that combines the best of data lakes and data warehouses. You'll get practical experience by creating and configuring your own Lakehouse environment. You'll work with Delta tables to ensure your data is processed reliably and learn how your Lakehouse seamlessly integrates with OneLake storage. Through guided lab exercises and interactive practice, you'll learn effective strategies for managing large volumes of data and gain the skills needed to implement a scalable, enterprise-grade Lakehouse.
  • Delta Lake and ACID principles
    • Master the transactional capabilities of Delta Lake, which are essential for ensuring data reliability in your Fabric Lakehouse. You will get a practical understanding of how ACID (Atomicity, Consistency, Isolation, and Durability) principles are implemented in Delta tables. Through hands-on exercises, you will learn to query historical data versions, recover data from errors or failures, and manage schema evolution as your data requirements change. This module will equip you with technical skills to build robust, enterprise-grade data systems that maintain integrity even with multiple, simultaneous operations.
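The versioning behavior this module covers can be illustrated with a toy model. This is a plain-Python sketch of the *idea* behind Delta Lake's time travel and restore—every commit produces a new table version you can read back or roll forward from—not the real Delta or Fabric API (the class name and methods here are hypothetical):

```python
from copy import deepcopy

class VersionedTable:
    """Toy illustration of Delta Lake-style versioning (not the real Delta API)."""

    def __init__(self):
        self._versions = [[]]  # version 0 is the empty table

    def append(self, rows):
        # Each committed write yields a new immutable snapshot
        # (atomicity and durability: readers never see a half-written version).
        new = deepcopy(self._versions[-1]) + list(rows)
        self._versions.append(new)
        return len(self._versions) - 1  # version number of this commit

    def read(self, version=None):
        # "Time travel": read any historical snapshot by version number.
        return self._versions[-1 if version is None else version]

    def restore(self, version):
        # Recover from a bad write by re-committing an earlier snapshot.
        self._versions.append(deepcopy(self._versions[version]))

t = VersionedTable()
t.append([{"id": 1}])     # version 1
t.append([{"id": 2}])     # version 2
print(t.read(version=1))  # historical read: only the first row
t.restore(1)              # undo the second append
print(t.read())           # latest version is back to one row
```

In real Delta tables the same moves are expressed as `DESCRIBE HISTORY`, `SELECT ... VERSION AS OF`, and `RESTORE`, operating on the transaction log rather than in-memory copies.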
  • Medallion architecture implementation
    • Design and implement a multi-layered data organization strategy using the medallion architecture pattern. You'll work with three distinct data layers: Bronze Layer (store raw, unchanged data), Silver Layer (cleanse and transform data, applying data quality controls), and Gold Layer (business-ready, curated datasets optimized for analytics). Through hands-on exercises, you'll create each layer, develop data quality controls and transformations, and build datasets that are ready for business analysis. By the end of this module, you'll be able to implement a full medallion architecture that provides a balance between raw data accessibility and quality control.
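The three layers can be sketched end to end in a few lines. This is a conceptual bronze→silver→gold flow in plain Python with made-up sample records—it shows the shape of the pattern, not Fabric's actual Spark or Dataflow implementation:

```python
# Hypothetical raw feed: note the whitespace, casing, and one malformed amount.
raw_events = [
    {"user": " Alice ", "amount": "12.50"},
    {"user": "bob", "amount": "bad"},      # record that fails quality checks
    {"user": "Alice", "amount": "7.50"},
]

# Bronze: land the data exactly as received, unchanged.
bronze = list(raw_events)

# Silver: cleanse and transform, applying data quality controls.
silver = []
for row in bronze:
    try:
        silver.append({
            "user": row["user"].strip().title(),  # normalize names
            "amount": float(row["amount"]),       # enforce numeric amounts
        })
    except ValueError:
        pass  # reject (or quarantine) records that fail validation

# Gold: aggregate into a business-ready dataset optimized for analytics.
gold = {}
for row in silver:
    gold[row["user"]] = gold.get(row["user"], 0.0) + row["amount"]

print(gold)  # {'Alice': 20.0}
```

The point of the layering is that the bad record is still preserved in bronze for audit and reprocessing, while gold consumers only ever see validated, curated data.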
  • Data ingestion strategies
    • Develop comprehensive data ingestion capabilities for your Fabric Lakehouse by choosing the right tool for each scenario. Through hands-on implementation, you will use Dataflows Gen2 for easy in-browser table loads, Pipelines for orchestrating and scheduling copy/ELT operations, Notebooks for custom code and transformations, and Eventstreams for real-time data processing. Practical exercises guide you through landing data as Delta tables in OneLake, mapping them to Lakehouse tables, setting refresh schedules, and selecting the appropriate ingestion approach within the Fabric web experience. You will demonstrate mastery by implementing multiple ingestion methods and articulating scenario-based tool selections that optimize loading efficiency and reliability.
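The tool-selection reasoning in this module can be summarized as a small decision rule. The function below is an illustrative rule of thumb distilled from the four tools listed above; the exact criteria and their ordering are assumptions, not official Microsoft guidance:

```python
def suggest_ingestion_tool(real_time: bool,
                           needs_custom_code: bool,
                           needs_orchestration: bool) -> str:
    """Rule-of-thumb mapping from scenario traits to a Fabric ingestion tool.

    Illustrative only: real selection also weighs data volume, source
    connectors, team skills, and refresh requirements.
    """
    if real_time:
        return "Eventstream"      # streaming sources, real-time processing
    if needs_custom_code:
        return "Notebook"         # custom code and transformations
    if needs_orchestration:
        return "Pipeline"         # scheduled copy/ELT orchestration
    return "Dataflow Gen2"        # low-code, in-browser table loads

# Example: a nightly batch copy from a source database.
print(suggest_ingestion_tool(real_time=False,
                             needs_custom_code=False,
                             needs_orchestration=True))  # Pipeline
```

In practice these tools also compose—for instance, a Pipeline can orchestrate a Notebook—so the rule picks a starting point rather than an exclusive choice.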
  • Query endpoints and security
    • Configure the access layers of your Fabric Lakehouse to make your data available to users while keeping it secure. You will get hands-on experience by creating a complete, end-to-end solution. This includes setting up SQL Analytics endpoints (enabling efficient data querying), connecting to Power BI (visualization and reporting), implementing fine-grained permissions (precise access control), and integrating with Microsoft Entra ID (secure, enterprise-grade authentication). By the end of this module, you'll be able to create an enterprise-ready Lakehouse environment that enables self-service analytics without compromising data security.

Taught by

Microsoft

Reviews

Start your review of Building Data Lakes and Lakehouses with Microsoft Fabric
