articles/data-factory/airflow-configurations.md (+7 -3)
@@ -10,11 +10,15 @@ ms.date: 02/13/2025
 # Supported Apache Airflow configurations

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
 In Workflow Orchestration Manager, Apache Airflow configurations can be integrated with the platform's runtime as key-value pairs. Although the `airflow.cfg` file isn't directly accessible in the UI, you can override these configurations in the UI's "Airflow Configuration overrides" section while retaining access to the other `airflow.cfg` settings. You can override most Apache Airflow configurations in Workflow Orchestration Manager, except for those explicitly listed in the provided table.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.

 For more information on Apache Airflow configurations, see [Configuration Reference](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html).
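As a rough illustration of the key-value model this article describes, the sketch below applies flat `section.key` override pairs (the shape used in the overrides UI) on top of a parsed `airflow.cfg`. The helper function is invented for illustration and isn't part of any Azure or Airflow API; the setting names are real Airflow options.

```python
# Sketch: how flat "section.key" override pairs map onto airflow.cfg sections.
# apply_overrides is a hypothetical helper, not an Azure or Airflow API.
import configparser

def apply_overrides(cfg: configparser.ConfigParser, overrides: dict) -> configparser.ConfigParser:
    """Apply flat 'section.key' -> value pairs onto a parsed airflow.cfg."""
    for flat_key, value in overrides.items():
        section, _, key = flat_key.partition(".")
        if not cfg.has_section(section):
            cfg.add_section(section)
        cfg.set(section, key, value)          # override wins over the file value
    return cfg

cfg = configparser.ConfigParser()
cfg.read_string("[core]\nparallelism = 32\n")  # minimal stand-in for airflow.cfg

apply_overrides(cfg, {"core.parallelism": "64", "scheduler.catchup_by_default": "False"})
print(cfg.get("core", "parallelism"))          # -> 64 (the override, not 32)
print(cfg.get("scheduler", "catchup_by_default"))  # -> False
```

Settings not mentioned in the overrides dictionary keep their `airflow.cfg` values, mirroring the behavior the paragraph describes.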
(next file in the diff; the file name isn't shown in this extract)

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
 This article demonstrates how to retrieve the IP address associated with your Workflow Orchestration Manager cluster and add it to your storage firewall's allowlist. Doing so enhances the security of data stores and resources by restricting access to the Workflow Orchestration Manager cluster within Azure Data Factory, preventing access from all other IP addresses via the public endpoint.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.

-> [!NOTE]
-> Importing DAGs is currently not supported by using blob storage with IP allow listing or by using private endpoints. We suggest using Git sync instead.
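A quick way to sanity-check an allowlist entry is to test whether the retrieved cluster IP falls inside it. The sketch below uses only the standard library; the addresses are invented placeholders, not real cluster or firewall values.

```python
# Sketch: check whether a retrieved cluster IP is covered by a firewall
# allowlist. The IPs and CIDR ranges below are hypothetical examples.
import ipaddress

def is_allowed(ip: str, allowlist: list) -> bool:
    """True if `ip` falls inside any CIDR range (or single IP) in the allowlist."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(entry, strict=False) for entry in allowlist)

allowlist = ["20.50.0.0/16", "52.168.1.7"]     # hypothetical firewall rules
print(is_allowed("20.50.12.34", allowlist))    # True: inside the /16 range
print(is_allowed("10.0.0.1", allowlist))       # False: not allowlisted
```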
articles/data-factory/airflow-import-dags-blob-storage.md (+7 -3)
@@ -10,11 +10,15 @@ ms.date: 02/13/2025
 # Import DAGs by using Azure Blob Storage

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
 This article shows you step-by-step instructions on how to import directed acyclic graphs (DAGs) into Workflow Orchestration Manager by using Azure Blob Storage.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
+
 ## Prerequisites

 - **Azure subscription**: If you don't have an Azure subscription, create a [free Azure account](https://azure.microsoft.com/pricing/purchase-options/azure-account?cid=msft_learn) before you begin.
(next file in the diff; the file name isn't shown in this extract)

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
-A python package is a way to organize related Python modules into a single directory hierarchy. A package is typically represented as a directory that contains a special file called `__init__.py`. Inside a package directory, you can have multiple Python module files (.py files) that define functions, classes, and variables.
-
-In the context of Workflow Orchestration Manager, you can create packages to add your custom code.
+A Python package is a way to organize related Python modules into a single directory hierarchy. A package is typically represented as a directory that contains a special file called `__init__.py`. Inside a package directory, you can have multiple Python module files (.py files) that define functions, classes, and variables. In the context of Workflow Orchestration Manager, you can create packages to add your custom code.
+
+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.

 This guide provides step-by-step instructions on installing a `.whl` (wheel) file, which serves as a binary distribution format for a Python package, in your Workflow Orchestration Manager environment.
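To make the package layout concrete, the sketch below builds the directory structure the paragraph describes (a directory with an `__init__.py` plus a module file) and imports it. The package and function names are invented for illustration.

```python
# Sketch: a minimal Python package built on disk, showing the role of
# __init__.py. "my_custom_pkg" and "greet" are hypothetical names.
import importlib
import sys
import tempfile
from pathlib import Path

pkg_root = Path(tempfile.mkdtemp())
pkg_dir = pkg_root / "my_custom_pkg"
pkg_dir.mkdir()

# __init__.py marks the directory as an importable package and can re-export names.
(pkg_dir / "__init__.py").write_text("from .helpers import greet\n")
# A module file inside the package defining a function.
(pkg_dir / "helpers.py").write_text("def greet(name):\n    return f'hello {name}'\n")

sys.path.insert(0, str(pkg_root))   # make the package directory importable
importlib.invalidate_caches()
pkg = importlib.import_module("my_custom_pkg")
print(pkg.greet("airflow"))         # -> hello airflow
```

A wheel (`.whl`) is simply this kind of package zipped into a standardized, installable archive, which is why it's a convenient way to ship custom code to an Airflow environment.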
(next file in the diff; the file name isn't shown in this extract)

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
-This article describes the pricing for Workflow Orchestration Manager usage within data factory.
+This article describes the pricing for Workflow Orchestration Manager usage within Azure Data Factory.
+
+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
(next file in the diff; the file name isn't shown in this extract)

-> [!NOTE]
-> This feature is in public preview. Workflow Orchestration Manager is powered by Apache Airflow.
-
 In this article, you learn how to synchronize your GitHub repository in Azure Data Factory Workflow Orchestration Manager in two different ways:

 - By using **Enable git sync** in the Workflow Orchestration Manager UI.
 - By using the REST API.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
articles/data-factory/ci-cd-pattern-with-airflow.md (+7 -3)
@@ -10,13 +10,17 @@ ms.date: 01/29/2025
 # CI/CD patterns with Workflow Orchestration Manager

-> [!NOTE]
-> Workflow Orchestration Manager is powered by Apache Airflow.
-
 Workflow Orchestration Manager provides a simple and efficient way to create and manage Apache Airflow environments. The service enables you to run data pipelines at scale with ease. There are two primary methods to run DAGs in Workflow Orchestration Manager: you can upload the DAG files to your blob storage and link them with the Airflow environment, or you can use the Git-sync feature to automatically sync your Git repository with the Airflow environment.

 Working with data pipelines in Airflow requires you to create or update your DAGs, plugins, and requirements files frequently, based on your workflow needs. Although developers can manually upload or edit DAG files in blob storage, many organizations prefer a continuous integration and continuous delivery (CI/CD) approach for code deployment. This article walks you through the recommended deployment patterns to seamlessly integrate and deploy your Apache Airflow DAGs with Workflow Orchestration Manager.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
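The core of the CI/CD pattern above is deploying only new or changed DAG files on each run. As a hedged sketch, the helper below syncs `.py` files between two local directories; a real pipeline would upload to the Blob Storage container (or Git-synced repo) linked to the Airflow environment instead. `sync_dags` and the file names are invented for illustration.

```python
# Sketch of a CI/CD deployment step: copy only new/changed DAG files.
# A real pipeline would target the environment's linked blob storage.
import filecmp
import shutil
import tempfile
from pathlib import Path

def sync_dags(src: Path, dst: Path) -> list:
    """Copy .py files from src to dst when missing or changed; return copied names."""
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.glob("*.py")):
        target = dst / f.name
        # shallow=False compares file contents, not just metadata
        if not target.exists() or not filecmp.cmp(f, target, shallow=False):
            shutil.copy2(f, target)
            copied.append(f.name)
    return copied

src = Path(tempfile.mkdtemp())
(src / "etl_dag.py").write_text("# placeholder DAG\n")
dst = Path(tempfile.mkdtemp()) / "dags"
print(sync_dags(src, dst))   # ['etl_dag.py'] copied on the first run
print(sync_dags(src, dst))   # [] on the second run: nothing changed
```

Running the step twice with no source changes deploys nothing, which keeps repeated pipeline runs idempotent.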
(next file in the diff; the file name isn't shown in this extract)

-> Apache Airflow is now accessible through Microsoft Fabric. Microsoft Fabric offers a wide range of Apache Airflow capabilities via Data Workflows.
-> We recommend migrating your existing Workflow Orchestration Manager (Apache Airflow in ADF) based workflows to Data Workflows (Apache Airflow in Microsoft Fabric) for a broader set of features. Apache Airflow capabilities will be Generally Available in Q1 CY2025 only in Microsoft Fabric.
-> For new Apache Airflow projects, we recommend using Apache Airflow in Microsoft Fabric. More details can be found [here](https://blog.fabric.microsoft.com/blog/introducing-data-workflows-in-microsoft-fabric?ft=All).
-> New users will not be allowed to create a new workflow orchestration manager in ADF, but existing users with a workflow orchestration manager may continue to use it but plan a migration soon.
-
-> [!NOTE]
-> Workflow Orchestration Manager for Azure Data Factory relies on the open source Apache Airflow application. Documentation and more tutorials for Airflow can be found on the Apache Airflow [Documentation](https://airflow.apache.org/docs/) or [Community](https://airflow.apache.org/community/) pages.

 Azure Data Factory offers serverless pipelines for data process orchestration, data movement with 100+ managed connectors, and visual transformations with the mapping data flow.

 Azure Data Factory's Workflow Orchestration Manager service is a simple and efficient way to create and manage [Apache Airflow](https://airflow.apache.org) environments, enabling you to run data pipelines at scale with ease.

 [Apache Airflow](https://airflow.apache.org) is an open-source platform used to programmatically create, schedule, and monitor complex data workflows. It allows you to define a set of tasks, called operators, that can be combined into directed acyclic graphs (DAGs) to represent data pipelines. Airflow enables you to execute these DAGs on a schedule or in response to an event, monitor the progress of workflows, and provide visibility into the state of each task. It's widely used in data engineering and data science to orchestrate data pipelines, and it's known for its flexibility, extensibility, and ease of use.
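The "tasks combined into a directed acyclic graph" idea can be sketched with the standard library's topological sorter, which computes the same upstream-before-downstream ordering a scheduler must respect. The task names are invented; a real Airflow DAG would define operators and dependencies with Airflow's own APIs.

```python
# Sketch: a DAG as tasks plus upstream dependencies, ordered topologically.
# Task names are hypothetical; this is not Airflow code.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it runs.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)   # ['extract', 'transform', 'load', 'notify']
```

Because every task's upstream set is honored, no task appears before its dependencies, which is exactly the guarantee a DAG scheduler provides.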
 :::image type="content" source="media/concepts-workflow-orchestration-manager/data-integration.png" alt-text="Screenshot shows data integration.":::

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
+
 ## When to use Workflow Orchestration Manager?

 Azure Data Factory offers [Pipelines](concepts-pipelines-activities.md) to visually orchestrate data processes (UI-based authoring), while Workflow Orchestration Manager offers Airflow-based Python DAGs (Python code-centric authoring) for defining the data orchestration process. If you have an Airflow background or are currently using Apache Airflow, you might prefer Workflow Orchestration Manager over pipelines. Conversely, if you don't want to write or manage Python-based DAGs for data process orchestration, you might prefer pipelines.
articles/data-factory/create-airflow-environment.md (+7 -3)
@@ -9,11 +9,15 @@ ms.date: 10/20/2023
 # Create an Airflow environment in Workflow Orchestration Manager

-> [!NOTE]
-> Workflow Orchestration Manager is powered by Apache Airflow.
-
 This article describes how to set up and configure an Airflow environment in Workflow Orchestration Manager.

+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.
+
 ## Prerequisites

 - **Azure subscription**: If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/pricing/purchase-options/azure-account?cid=msft_learn) before you begin.
articles/data-factory/delete-dags-in-workflow-orchestration-manager.md (+8 -4)
@@ -11,10 +11,14 @@ ms.subservice: orchestration
 # Delete files in Workflow Orchestration Manager

-> [!NOTE]
-> Workflow Orchestration Manager is powered by Apache Airflow.
-
-This article walks you through the steps to delete directed acyclic graph (DAG) files in Workflow Orchestration Manager environment.
+This article walks you through the steps to delete directed acyclic graph (DAG) files in a Workflow Orchestration Manager environment.
+
+> [!IMPORTANT]
+> Workflow Orchestration Manager (powered by Apache Airflow) will be permanently retired in Azure Data Factory on December 31, 2025. The feature is now available in Microsoft Fabric. [Learn more](https://learn.microsoft.com/fabric/data-factory/apache-airflow-jobs-concepts)
+>
+> We recommend that you migrate all Workflow Orchestration Manager (Apache Airflow in Azure Data Factory) workloads to Data Workflows (Apache Airflow in Microsoft Fabric) to benefit from expanded capabilities before December 31, 2025.
+>
+> For more information or for support during your migration to Apache Airflow in Microsoft Fabric, contact Microsoft Support.