
Commit 3c2bc88

Merge pull request #52972 from MicrosoftDocs/NEW-DP-databricks-engineering
Databricks release -> main
2 parents 0908e90 + 27648cc commit 3c2bc88

469 files changed

Lines changed: 16264 additions & 2 deletions

Lines changed: 62 additions & 0 deletions
Lines changed: 40 additions & 0 deletions
@@ -0,0 +1,40 @@
### YamlMime:LearningPath
uid: learn.wwl.azure-databricks-data-engineer-deploy-maintain-data-pipelines-workloads
metadata:
  title: Deploy and Maintain Data Pipelines and Workloads With Azure Databricks
  description: Master the complete lifecycle of building, deploying, and maintaining production-ready data pipelines in Azure Databricks—from design and orchestration to monitoring and optimization.
  ms.date: 12/09/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: learning-path
title: Deploy and maintain data pipelines and workloads with Azure Databricks
prerequisites: |
  - Good understanding of Azure Databricks workspaces
  - Familiarity with data engineering concepts and SQL
  - Experience with Python programming and notebooks
  - Knowledge of Git version control fundamentals
summary: |
  Master the complete lifecycle of building, deploying, and maintaining production-ready data pipelines in Azure Databricks—from design and orchestration to monitoring and optimization.

  By the end of this learning path, you'll be able to:

  - Design and implement robust data pipelines using notebooks and Lakeflow Declarative Pipelines
  - Create and orchestrate Lakeflow Jobs with triggers, schedules, and error handling
  - Apply version control and deploy pipelines across environments using Git and Databricks Asset Bundles
  - Monitor, troubleshoot, and optimize data workloads for reliability and performance
iconUrl: /training/achievements/generic-trophy.svg
levels:
- intermediate
roles:
- data-engineer
products:
- azure-databricks
subjects:
- data-engineering
modules:
- learn.wwl.design-implement-data-pipelines
- learn.wwl.implement-lakeflow-jobs
- learn.wwl.implement-development-lifecycle-processes-azure-databricks
- learn.wwl.monitor-troubleshoot-optimize-workloads-azure-databricks
trophy:
  uid: learn.wwl.azure-databricks-data-engineer-deploy-maintain-data-pipelines-workloads.trophy
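The learning path above centers on orchestrating Lakeflow Jobs and deploying them with Databricks Asset Bundles. As a rough sketch of the kind of artifact learners end up writing, a minimal `databricks.yml` bundle defining one scheduled job might look like the following; the bundle name, notebook path, notification address, and targets are hypothetical, not part of this commit:

```yaml
# Minimal Databricks Asset Bundle sketch (all names and paths are hypothetical).
bundle:
  name: nightly-etl

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      schedule:
        quartz_cron_expression: "0 0 2 * * ?"   # trigger daily at 02:00
        timezone_id: UTC
      email_notifications:
        on_failure:
          - data-team@example.com               # basic error-handling alert
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ../src/ingest.py     # hypothetical notebook path

targets:
  dev:
    mode: development
    default: true
  prod:
    mode: production
```

With the Databricks CLI, such a bundle is typically deployed per environment with `databricks bundle deploy -t dev` or `-t prod`, which is what the version-control and deployment objectives above refer to.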
Lines changed: 34 additions & 0 deletions
@@ -0,0 +1,34 @@
### YamlMime:LearningPath
uid: learn.wwl.azure-databricks-data-engineer-prepare-process-data
metadata:
  title: Prepare and Process Data with Azure Databricks
  description: Master the essential skills to build robust, scalable data engineering solutions with Azure Databricks and Unity Catalog. Learn to design effective data models, ingest data from diverse sources, transform raw data into analytics-ready formats, and ensure data quality across your lakehouse architecture.
  ms.date: 12/09/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: learning-path
title: Prepare and process data with Azure Databricks
prerequisites: |
  - Good understanding of Azure Databricks workspaces and Unity Catalog concepts
  - Familiarity with SQL and Python programming
  - Knowledge of fundamental data engineering and data warehouse concepts
summary: |
  Master the essential skills to build robust, scalable data engineering solutions with Azure Databricks and Unity Catalog. Learn to design effective data models, ingest data from diverse sources, transform raw data into analytics-ready formats, and ensure data quality across your lakehouse architecture.

  In this learning path, you'll learn how to build a data engineering workflow using Azure Databricks and Unity Catalog. Starting with foundational data modeling concepts, you'll design schemas and partitioning strategies optimized for analytical workloads. You'll then explore multiple ingestion patterns—from managed connectors to streaming pipelines—to bring data into your lakehouse. Next, you'll apply transformation techniques to cleanse and reshape data for business use. Finally, you'll implement quality controls to maintain data integrity throughout your pipelines. By the end, you'll have the practical skills to architect and build production-ready data solutions in Unity Catalog.
iconUrl: /training/achievements/generic-trophy.svg
levels:
- intermediate
roles:
- data-engineer
products:
- azure-databricks
subjects:
- data-engineering
modules:
- learn.wwl.design-implement-data-modeling-unity-catalog
- learn.wwl.ingest-data-into-unity-catalog
- learn.wwl.cleanse-transform-load-data-into-unity-catalog
- learn.wwl.implement-manage-data-quality-constraints-unity-catalog
trophy:
  uid: learn.wwl.azure-databricks-data-engineer-prepare-process-data.trophy
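For the ingestion-and-transformation path above, the pipelines learners build can also be declared as bundle resources. A sketch of a declarative pipeline resource, assuming hypothetical catalog, schema, and notebook names not taken from this commit:

```yaml
# Sketch of a pipeline resource in a Databricks Asset Bundle
# (catalog, target schema, and notebook path are hypothetical).
resources:
  pipelines:
    bronze_ingest:
      name: bronze-ingest
      catalog: main              # Unity Catalog catalog to publish tables into
      target: bronze             # target schema for the pipeline's tables
      libraries:
        - notebook:
            path: ../src/ingest_bronze.py
      continuous: false          # triggered runs rather than continuous mode
```

Publishing to a Unity Catalog catalog and schema is what lets the downstream cleansing and data-quality modules in this path operate on governed, discoverable tables.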
Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
### YamlMime:LearningPath
uid: learn.wwl.azure-databricks-data-engineer-secure-govern-unity-catalog
metadata:
  title: Secure and Govern Unity Catalog Objects in Azure Databricks
  description: Learn to implement comprehensive security and governance for your data assets in Azure Databricks using Unity Catalog. Master access control strategies, fine-grained permissions, credential management, and governance practices to build a secure and compliant data platform.
  ms.date: 12/09/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: learning-path
title: Secure and govern Unity Catalog objects in Azure Databricks
prerequisites: |
  - Good understanding of Azure Databricks workspaces and Unity Catalog concepts
  - Familiarity with SQL and data access patterns
  - Knowledge of Microsoft Entra ID and Azure security fundamentals
summary: |
  Learn to implement comprehensive security and governance for your data assets in Azure Databricks using Unity Catalog. Master access control strategies, fine-grained permissions, credential management, and governance practices to build a secure and compliant data platform.

  In this learning path, you'll learn how to secure and govern your data estate with Unity Catalog. You'll start by implementing robust security controls, including table and schema-level permissions, row filtering, and column masking. You'll discover how to manage credentials securely using Azure Key Vault, service principals, and managed identities. Then, you'll explore governance capabilities such as data lineage tracking, audit logging, and attribute-based access control. Finally, you'll learn to share data securely with external parties using Delta Sharing. By the end, you'll be equipped to build a secure, compliant, and well-governed data platform that meets enterprise requirements.
iconUrl: /training/achievements/generic-trophy.svg
levels:
- intermediate
roles:
- data-engineer
products:
- azure-databricks
subjects:
- data-engineering
modules:
- learn.wwl.secure-unity-catalog-objects
- learn.wwl.govern-unity-catalog-objects
trophy:
  uid: learn.wwl.azure-databricks-data-engineer-secure-govern-unity-catalog.trophy
Lines changed: 35 additions & 0 deletions
@@ -0,0 +1,35 @@
### YamlMime:LearningPath
uid: learn.wwl.azure-databricks-data-engineer-setup-configure-environment
metadata:
  title: Set up and Configure an Azure Databricks Environment
  description: Build a solid foundation in Azure Databricks by understanding its architecture, integrations, compute options, and data organization capabilities. Learn how Azure Databricks provides a unified platform for data engineering, analytics, and AI workloads in the cloud.
  ms.date: 12/09/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: learning-path
title: Set up and configure an Azure Databricks environment
prerequisites: |
  - Fundamental knowledge of data analytics concepts
  - Basic understanding of cloud storage concepts
  - Familiarity with SQL and data organization principles
summary: |
  Build a solid foundation in Azure Databricks by understanding its architecture, integrations, compute options, and data organization capabilities. Learn how Azure Databricks provides a unified platform for data engineering, analytics, and AI workloads in the cloud.

  In this learning path, you'll explore the fundamentals of Azure Databricks and how it fits into the modern data platform ecosystem. You'll start by provisioning workspaces and understanding core workloads, then dive into the architectural concepts that separate control and compute planes. You'll discover how Azure Databricks integrates seamlessly with Microsoft Fabric, Power BI, Visual Studio Code, and other Microsoft services to create comprehensive solutions. You'll learn to select and configure the right compute resources for your workloads, optimizing for both performance and cost. Finally, you'll master Unity Catalog's organizational structure to effectively manage your data assets. By the end, you'll have the foundational knowledge needed to build scalable data solutions on Azure Databricks.
iconUrl: /training/achievements/generic-trophy.svg
levels:
- intermediate
roles:
- data-engineer
products:
- azure-databricks
subjects:
- data-engineering
modules:
- learn.wwl.explore-azure-databricks
- learn.wwl.understand-azure-databricks-architecture
- learn.wwl.understand-azure-databricks-integrations
- learn.wwl.select-and-configure-compute
- learn.wwl.create-and-organize-objects-in-unity-catalog
trophy:
  uid: learn.wwl.azure-databricks-data-engineer-setup-configure-environment.trophy
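The setup path above includes a module on selecting and configuring compute. As an illustration of the kind of decision that module covers, a job task's compute can be pinned down in configuration; in this sketch the runtime version, node type, and autoscale bounds are illustrative values, not recommendations from this commit:

```yaml
# Sketch of per-task job compute configuration
# (runtime, node type, and autoscale bounds are illustrative).
resources:
  jobs:
    example_job:
      name: example-job
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ../src/transform.py  # hypothetical notebook path
          new_cluster:
            spark_version: 15.4.x-scala2.12     # a Databricks Runtime LTS version
            node_type_id: Standard_DS3_v2       # an Azure VM size
            autoscale:
              min_workers: 1                    # scale down when idle to save cost
              max_workers: 4                    # cap scale-out for predictable spend
```

Choosing autoscaling job compute over an always-on all-purpose cluster is the kind of performance-versus-cost trade-off the compute module asks learners to reason about.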

learn-pr/wwl-data-ai/explore-azure-databricks/index.yml

Lines changed: 2 additions & 2 deletions
@@ -3,7 +3,7 @@ uid: learn.wwl.explore-azure-databricks
 metadata:
   title: Explore Azure Databricks
   description: Explore Azure Databricks
-  ms.date: 09/12/2025
+  ms.date: 12/09/2025
   author: weslbo
   ms.author: wedebols
   ms.topic: module
@@ -18,7 +18,7 @@ abstract: |
   - Describe key concepts of an Azure Databricks solution
 prerequisites: |
   Before starting this module, you should have a fundamental knowledge of data analytics concepts. Consider completing [Azure Data Fundamentals certification](/credentials/certifications/azure-data-fundamentals?azure-portal=true) before starting this module.
-iconUrl: /learn/achievements/describe-azure-databricks.svg
+iconUrl: /learn/achievements/azure-databricks.svg
 levels:
 - intermediate
 roles:
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
### YamlMime:ModuleUnit
uid: learn.wwl.cleanse-transform-load-data-into-unity-catalog.introduction
title: Introduction
metadata:
  title: Introduction
  description: "Introduction to cleansing, transforming, and loading data into Unity Catalog"
  ms.date: 12/07/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: unit
  ai-usage: ai-generated
durationInMinutes: 2
content: |
  [!include[](includes/1-introduction.md)]
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
### YamlMime:ModuleUnit
uid: learn.wwl.cleanse-transform-load-data-into-unity-catalog.summary
title: Summary
metadata:
  title: Summary
  description: "Summary"
  ms.date: 12/07/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: unit
  ai-usage: ai-generated
durationInMinutes: 2
content: |
  [!include[](includes/10-summary.md)]
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
### YamlMime:ModuleUnit
uid: learn.wwl.cleanse-transform-load-data-into-unity-catalog.profile-data-summary-statistics
title: Profile data
metadata:
  title: Profile Data
  description: Learn how to profile data to generate summary statistics and assess data distributions in Azure Databricks using SQL commands and the data profiling feature.
  ms.date: 12/07/2025
  author: weslbo
  ms.author: wedebols
  ms.topic: unit
  ai-usage: ai-generated
durationInMinutes: 6
content: |
  [!include[](includes/2-profile-data-summary-statistics.md)]
