
Commit 7c853f7 ("added videos")
Parent: 6b37a12
13 files changed: 22 additions & 10 deletions

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/1-introduction.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Introduction
 metadata:
   title: Introduction
   description: "Introduction to implementing and managing data quality constraints in Unity Catalog"
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/2-implement-validation-checks.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Implement validation checks
 metadata:
   title: Implement Validation Checks
   description: Learn how to implement validation checks including nullability, data cardinality, and range checking in Azure Databricks using pipeline expectations and table constraints.
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/3-implement-data-type-checks.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Implement data type checks
 metadata:
   title: Implement Data Type Checks
   description: Learn how to implement data type checks in Azure Databricks using schema enforcement, explicit casting, constraints, and pipeline expectations to validate data quality.
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/4-detect-manage-schema-drift.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Detect and manage schema drift
 metadata:
   title: Detect and Manage Schema Drift
   description: Learn how to detect and manage schema drift in Azure Databricks data pipelines using Delta Lake, Auto Loader, schema evolution, and error handling strategies.
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/5-manage-data-quality-pipeline-expectations.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Manage data quality with pipeline expectations
 metadata:
   title: Manage Data Quality with Pipeline Expectations
   description: Learn how to use pipeline expectations in Lakeflow Spark Declarative Pipelines to define and enforce data quality rules for streaming tables and materialized views.
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/6-knowledge-check.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Module assessment
 metadata:
   title: Module assessment
   description: "Knowledge check"
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/7-summary.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -4,7 +4,7 @@ title: Summary
 metadata:
   title: Summary
   description: "Summary"
-  ms.date: 12/07/2025
+  ms.date: 02/19/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit
```

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/includes/1-introduction.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -1,3 +1,5 @@
+>[!VIDEO https://learn-video.azurefd.net/vod/player?id=8f94718e-160e-4b3f-902f-fb1d807a3107]
+
 **Data quality** issues can derail analytics projects, corrupt business reports, and erode confidence in your data platform. When invalid data enters your tables—whether through type mismatches, missing required values, or unexpected schema changes—problems compound as that data flows through pipelines and reaches downstream consumers. Implementing robust **data quality constraints** at the point of ingestion creates a foundation of trust in your data assets.
 
 Azure Databricks provides multiple mechanisms for enforcing data quality within **Unity Catalog**. **Schema enforcement** validates data types automatically when writing to Delta Lake tables. **Table constraints** define rules that reject invalid records at write time. **Pipeline expectations** in Lakeflow Spark Declarative Pipelines enable real-time quality checks on streaming data with configurable actions for violations. Together, these capabilities form a comprehensive approach to maintaining data integrity.
```
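The introduction text in this diff describes pipeline expectations that apply configurable actions when records violate a quality rule. As a rough, hypothetical sketch of that idea in plain Python (not the actual Lakeflow Spark Declarative Pipelines API), a "drop on violation" expectation amounts to filtering records on a predicate and reporting the drop count:

```python
# Illustrative sketch only: simulates the "drop invalid records" action that
# the text attributes to pipeline expectations. All names here are invented.
def apply_expectation(records, name, predicate):
    """Keep records passing a quality predicate; report how many were dropped."""
    kept = [r for r in records if predicate(r)]
    dropped = len(records) - len(kept)
    print(f"expectation {name!r}: kept {len(kept)}, dropped {dropped}")
    return kept

orders = [
    {"order_id": 1, "order_total": "19.99"},
    {"order_id": 2, "order_total": None},   # violates the rule below
    {"order_id": 3, "order_total": "5"},
]
valid = apply_expectation(orders, "total_not_null",
                          lambda r: r["order_total"] is not None)
```

The real mechanism runs inside the pipeline engine and also supports warn and fail actions; the sketch only shows the record-level semantics of the drop case.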

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/includes/2-implement-validation-checks.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -1,3 +1,5 @@
+>[!VIDEO https://learn-video.azurefd.net/vod/player?id=18f27350-8c3e-41ce-aa74-80fe7ddb2ec8]
+
 Data pipelines often encounter records with missing values, duplicate identifiers, or values that fall outside acceptable ranges. Without validation checks in place, these data quality issues propagate downstream, causing incorrect analytics, failed reports, and unreliable business decisions. Validation checks catch these problems at the point of data ingestion, ensuring that only quality data flows through your pipeline.
 
 In this unit, you learn how to implement validation checks for nullability, data cardinality, and range checking using Lakeflow Spark Declarative Pipelines expectations and Delta Lake table constraints.
```
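This unit names three check types: nullability, cardinality, and range. The following plain-Python sketch illustrates what each rule tests; the column names and bounds are invented for the example, and a real pipeline would express these as expectations or constraints rather than a loop:

```python
# Hypothetical illustration of the three validation check types named above.
def validate(rows):
    errors = []
    seen_ids = set()
    for row in rows:
        if row["customer_id"] is None:          # nullability check
            errors.append(("null_id", row))
        elif row["customer_id"] in seen_ids:    # cardinality check (uniqueness)
            errors.append(("duplicate_id", row))
        else:
            seen_ids.add(row["customer_id"])
        if not (0 <= row["age"] <= 120):        # range check
            errors.append(("age_out_of_range", row))
    return errors

errors = validate([
    {"customer_id": 1, "age": 34},
    {"customer_id": 1, "age": 200},    # duplicate id AND out-of-range age
    {"customer_id": None, "age": 50},  # missing id
])
```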

learn-pr/wwl-databricks/implement-manage-data-quality-constraints-unity-catalog/includes/3-implement-data-type-checks.md

Lines changed: 6 additions & 2 deletions
```diff
@@ -1,3 +1,5 @@
+>[!VIDEO https://learn-video.azurefd.net/vod/player?id=9a912c64-3831-409b-a8b7-23d67acec030]
+
 Every column in your dataset has an expected data type, and when incoming data doesn't match that expectation, problems cascade downstream. A string value where an integer belongs can break calculations, corrupt aggregations, or cause pipeline failures. Data type checks ensure that each field contains values of the correct type before they enter your tables.
 
 In this unit, you learn how to implement data type checks using schema enforcement, explicit type casting, and validation constraints in Azure Databricks.
```
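The unit text above mentions explicit type casting. Spark SQL's `try_cast` returns NULL when a cast fails instead of raising an error; a minimal Python analogue of that fail-soft behavior (illustrative only, using `None` in place of NULL) looks like this:

```python
# Sketch of fail-soft casting: invalid inputs surface as None instead of
# crashing the pipeline, mirroring the NULL-on-failure behavior of try_cast.
def try_cast_float(value):
    try:
        return float(value)
    except (TypeError, ValueError):
        return None

raw = ["19.99", "abc", None, "5"]
cast = [try_cast_float(v) for v in raw]
```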
````diff
@@ -75,9 +77,11 @@ For example, you can validate that a string column contains only numeric charact
 ```sql
 CREATE TABLE orders (
   order_id INT,
-  order_total STRING,
-  CONSTRAINT valid_order_total CHECK (order_total REGEXP '^[0-9]+(\\.[0-9]+)?$')
+  order_total STRING
 );
+
+ALTER TABLE orders
+ADD CONSTRAINT valid_order_total CHECK (order_total REGEXP '^[0-9]+(\\.[0-9]+)?$')
 ```
 
 This constraint ensures the `order_total` column contains values that look like valid numbers, catching malformed data before it enters the table.
````
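The CHECK constraint in this hunk relies on a regular expression. The same pattern works in Python's `re` module (the doubled backslash in the SQL string literal becomes a single backslash in a raw Python string), so you can test candidate values against it before loading data; the sample values here are invented:

```python
import re

# The pattern from the valid_order_total constraint: one or more digits,
# optionally followed by a decimal point and more digits.
NUMERIC = re.compile(r'^[0-9]+(\.[0-9]+)?$')

samples = ["19.99", "42", "1,000", "-5", "3.14.15", ""]
accepted = [s for s in samples if NUMERIC.fullmatch(s)]
```

Note that the pattern rejects negative numbers and thousands separators; whether that is desirable depends on how `order_total` values are produced upstream.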
