- Explore Azure Databricks
In this module, you learn how to:
- Provision an Azure Databricks workspace
- Identify core workloads for Azure Databricks
- Use data governance tools, including Unity Catalog and Microsoft Purview
- Describe key concepts of an Azure Databricks solution
- Understand Azure Databricks architecture
In this module, you learn about Azure Databricks architecture, including the account hierarchy, the control and compute planes, and the storage options for managing data with Unity Catalog.
By the end of this module, you'll be able to:
- Describe the Azure Databricks account hierarchy and how it organizes resources
- Explain the difference between control plane and compute plane
- Understand the role of workspace storage in Azure Databricks
- Describe default storage capabilities in serverless workspaces
- Explain how external locations connect cloud storage to Unity Catalog
- Understand how Unity Catalog managed storage organizes data across catalogs and schemas
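The storage objectives above center on Unity Catalog's three-level namespace, in which every table is addressed as `catalog.schema.table`. The following is a minimal illustrative sketch of that addressing scheme; the helper function and its validation rule are simplified assumptions for this outline, not the official Databricks identifier logic.

```python
# Sketch of Unity Catalog's three-level namespace: each table is
# referenced as catalog.schema.table. The validation here is an
# illustrative assumption, not Databricks' real identifier rules.

def qualify(catalog: str, schema: str, table: str) -> str:
    """Return a fully qualified Unity Catalog table name."""
    for part in (catalog, schema, table):
        if not part or "." in part:
            raise ValueError(f"invalid identifier: {part!r}")
    return f"{catalog}.{schema}.{table}"

print(qualify("dev", "sales", "orders"))  # dev.sales.orders
```

The same three-part form is what connects managed storage to catalogs and schemas: the catalog and schema segments determine where Unity Catalog places the table's data.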
- Understand Azure Databricks Integrations
In this module, you learn how Azure Databricks integrates with Microsoft tools and services, including Fabric, Power BI, VS Code, Power Platform, Copilot Studio, Purview, and Foundry, to enable comprehensive data and AI solutions.
By the end of this module, you'll be able to:
- Understand how Azure Databricks integrates with Microsoft Fabric for bidirectional data access
- Describe the integration capabilities between Power BI and Azure Databricks
- Explain how the Databricks extension for VS Code enables local development with remote execution
- Understand how Power Platform can work with Azure Databricks data
- Describe how AI agents in Copilot Studio can access Databricks data
- Understand data governance integration with Microsoft Purview
- Explain how Microsoft Foundry agents connect with Databricks Genie spaces
- Select and Configure Compute in Azure Databricks
In this module, you learn how to select and configure appropriate compute resources in Azure Databricks, including serverless compute, classic compute, SQL warehouses, and job clusters, and how to handle performance tuning, access control, and library management.
By the end of this module, you'll be able to:
- Choose appropriate compute types for different Azure Databricks workloads
- Configure compute performance settings including node types, autoscaling, and termination
- Enable compute features like Photon acceleration and select appropriate Databricks Runtime versions
- Configure compute access permissions and dedicated group access modes
- Install and manage libraries on compute resources using various methods
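The performance settings listed above (node types, autoscaling, auto-termination, Photon, runtime version) can be pictured as fields of a single compute configuration. The sketch below is loosely modeled on the Databricks Clusters API payload; the field names and values are illustrative assumptions, so check the API reference before relying on them.

```python
# Hypothetical classic compute configuration, loosely modeled on the
# Databricks Clusters API. Field names/values are assumptions for
# illustration; consult the official API reference for real payloads.

cluster_config = {
    "cluster_name": "etl-job-cluster",           # assumed name
    "spark_version": "15.4.x-scala2.12",         # assumed Runtime version
    "node_type_id": "Standard_DS3_v2",           # assumed Azure VM size
    "autoscale": {"min_workers": 2, "max_workers": 8},
    "autotermination_minutes": 30,               # stop idle compute
    "runtime_engine": "PHOTON",                  # Photon acceleration
}

def validate(cfg: dict) -> bool:
    """Basic sanity checks on the sketched configuration."""
    scale = cfg["autoscale"]
    return (
        0 < scale["min_workers"] <= scale["max_workers"]
        and cfg["autotermination_minutes"] > 0
    )
```

Auto-termination and a bounded autoscale range are the two settings most directly tied to cost control, which is why the sketch checks them.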
- Create and organize objects in Unity Catalog
In this module, you learn how to create and organize catalogs, schemas, tables, views, and volumes in Unity Catalog to build a comprehensive data governance framework with consistent naming conventions and AI/BI integration.
By the end of this module, you'll be able to:
- Apply effective naming conventions for Unity Catalog objects
- Create catalogs and schemas to organize data assets across environments
- Create managed and external tables, views, and volumes in Unity Catalog
- Implement DDL operations including functions and stored procedures
- Implement foreign catalogs to access external database systems
- Configure AI/BI Genie instructions for natural language data discovery
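The first three objectives above (naming conventions, catalogs, schemas) can be combined in one small sketch: a helper that enforces a snake_case convention and emits the corresponding Unity Catalog DDL. The snake_case rule and the helper itself are illustrative assumptions; the `CREATE CATALOG`/`CREATE SCHEMA`/`CREATE TABLE` keywords are standard Unity Catalog DDL.

```python
import re

# Hedged sketch: emit Unity Catalog DDL that follows a simple
# snake_case naming convention. The convention and helper are
# assumptions for illustration, not an official Databricks rule.

SNAKE_CASE = re.compile(r"^[a-z][a-z0-9_]*$")

def ddl_for(catalog: str, schema: str, table: str) -> list:
    """Return CREATE statements for a catalog/schema/table trio."""
    for name in (catalog, schema, table):
        if not SNAKE_CASE.match(name):
            raise ValueError(f"{name!r} violates the naming convention")
    return [
        f"CREATE CATALOG IF NOT EXISTS {catalog}",
        f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}",
        f"CREATE TABLE IF NOT EXISTS {catalog}.{schema}.{table} (id BIGINT)",
    ]
```

Checking names before emitting DDL keeps environments (for example `dev`, `test`, `prod` catalogs) consistently named, which is the point of the naming-convention objective.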
Syllabus
- Explore Azure Databricks
- Introduction
- Get started with Azure Databricks
- Identify Azure Databricks workloads
- Understand key concepts
- Data governance using Unity Catalog and Microsoft Purview
- Exercise - Explore Azure Databricks
- Module assessment
- Summary
- Understand Azure Databricks architecture
- Introduction
- Understand Azure Databricks architecture
- Understand Unity Catalog managed storage
- Understand external storage
- Understand default storage (serverless compute)
- Module assessment
- Summary
- Understand Azure Databricks Integrations
- Introduction
- Understand integration with Microsoft Fabric
- Understand integration with Power BI
- Understand integration with VS Code
- Understand integration with Power Platform
- Understand integration with Copilot Studio
- Understand integration with Microsoft Purview
- Understand integration with Microsoft Foundry
- Module assessment
- Summary
- Select and Configure Compute in Azure Databricks
- Introduction
- Choose an appropriate compute type
- Configure compute performance
- Configure compute features
- Install libraries for compute
- Configure compute access
- Exercise - Select and Configure Compute in Azure Databricks
- Module assessment
- Summary
- Create and organize objects in Unity Catalog
- Introduction
- Apply naming conventions
- Create catalog
- Create schema
- Create tables and views
- Create volumes
- Implement DDL operations
- Implement foreign catalog
- Configure AI/BI Genie instructions
- Exercise - Create and Organize Objects in Unity Catalog
- Knowledge check
- Summary