| title | Module 1 - Create a pipeline with Data Factory |
|---|---|
| description | In this tutorial, you create a pipeline to copy data as part of an end-to-end guide to complete a full data integration scenario within an hour using Data Factory in Microsoft Fabric. |
| ms.reviewer | xupzhou |
| ms.topic | tutorial |
| ms.date | 11/18/2025 |
| ms.custom | pipelines, sfi-image-nochange |
This module takes about 10 minutes to complete. You'll ingest raw data from the source store into a table in the bronze layer of a data lakehouse using the Copy activity in a pipeline.
The high-level steps in module 1 are:
- Create a pipeline.
- Create a copy job activity in the pipeline to load sample data into a data lakehouse.
- Run and view the results of the copy job.
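The first of these steps can also be scripted. As a hedged sketch (not part of the tutorial's UI flow), the Microsoft Fabric REST Items API can create an empty pipeline item; the workspace ID, pipeline name, and token below are placeholders, not values from this tutorial.

```python
# Sketch: create an empty pipeline item through the Microsoft Fabric REST
# Items API. The workspace ID and bearer token are placeholders.

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_create_pipeline_request(workspace_id: str, pipeline_name: str):
    """Return the (url, body) pair for creating a DataPipeline item."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    body = {"displayName": pipeline_name, "type": "DataPipeline"}
    return url, body

url, body = build_create_pipeline_request("<workspace-id>", "Tutorial_Pipeline")
# An actual call would POST `body` to `url` with an
# "Authorization: Bearer <token>" header.
```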
- A [!INCLUDE product-name] tenant account with an active subscription. If you don't have one, you can try Microsoft Fabric for free.
- A [!INCLUDE product-name] enabled workspace. Learn how to create a workspace.
- Access to Power BI.
1. Sign in to Power BI.
1. Select the default Power BI icon at the bottom left of the screen, and select Fabric.
1. Select a workspace from the Workspaces tab or select My workspace, then select + New item, and search for and choose Pipeline.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/new-data-pipeline.png" alt-text="Screenshot of the Data Factory start page with the button to create a new pipeline selected." lightbox="media/tutorial-end-to-end-pipeline/new-data-pipeline.png":::
1. Provide a pipeline name, then select Create.
1. Select Copy data assistant to open the copy assistant tool.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/open-copy-assistant.png" alt-text="Screenshot showing the selection of the copy data assistant from the new pipeline start page." lightbox="media/tutorial-end-to-end-pipeline/open-copy-assistant.png":::
1. On the Choose data source page, select Sample data from the options at the top of the dialog, and then select NYC Taxi - Green.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/select-sample-data-source.png" alt-text="Screenshot showing the selection of the NYC Taxi - Green data in the copy assistant on the Choose data source tab." lightbox="media/tutorial-end-to-end-pipeline/select-sample-data-source.png":::
1. The data source preview appears next on the Connect to data source page. Review it, then select Next.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/preview-data.png" alt-text="Screenshot showing the preview data for the NYC Taxi - Green sample dataset." lightbox="media/tutorial-end-to-end-pipeline/preview-data.png":::
1. For the Choose data destination step of the copy assistant, select Lakehouse.
1. Enter a Lakehouse name, then select Create and connect.
1. Select Connect.
1. Select Full copy for the copy job mode.
1. When mapping to destination, select Tables, select Append as the update method, and edit the table mapping so the destination table is named Bronze. Then select Next.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/choose-destination-table-details.png" alt-text="Screenshot showing the Connect to data destination tab of the Copy data assistant, on the Select and map to folder path or table step." lightbox="media/tutorial-end-to-end-pipeline/choose-destination-table-details.png":::
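Behind the copy assistant, these choices end up in the pipeline's JSON definition, which you can inspect in the editor. The abridged fragment below is an illustrative sketch only (the property names are assumptions, not the exact schema); it shows how the sample source, the Lakehouse Tables destination, the Append update method, and the Bronze table name fit together.

```json
{
  "name": "Copy_NycTaxiGreen_to_Bronze",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "SampleDataSource", "sample": "NYC Taxi - Green" },
    "sink": {
      "type": "LakehouseTableSink",
      "table": "Bronze",
      "tableActionOption": "Append"
    }
  }
}
```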
1. On the Review + save page of the copy data assistant, review the configuration and then select Save.
1. Select the copy job activity on the pipeline canvas, then select the Settings tab below the canvas.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/select-settings.png" alt-text="Screenshot of the pipeline canvas with the copy job activity highlighted and the settings tab highlighted." lightbox="media/tutorial-end-to-end-pipeline/select-settings.png":::
1. Select the Connection drop-down and select Browse all.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/browse-all.png" alt-text="Screenshot of the copy job activity settings list, with browse all highlighted." lightbox="media/tutorial-end-to-end-pipeline/browse-all.png":::
1. Select Copy job under New sources.
1. On the Connect data source page, select Sign in to authenticate the connection.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/select-sign-in.png" alt-text="Screenshot of the get data connection credentials page, with the Sign in Option highlighted." lightbox="media/tutorial-end-to-end-pipeline/select-sign-in.png":::
1. Follow the prompts to sign in to your organizational account.
1. Select Connect to complete the connection setup.
1. At the top of the pipeline editor, select Save to save the pipeline.
1. At the top of the pipeline editor, select Run to run the pipeline and copy the data.

   > [!NOTE]
   > This copy can take over 30 minutes to complete.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/run-pipeline.png" alt-text="Screenshot of the pipeline editor with the Run button highlighted." lightbox="media/tutorial-end-to-end-pipeline/run-pipeline.png":::
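A run can also be triggered outside the editor. As a hedged sketch based on the Fabric Core Jobs API (the workspace and pipeline item IDs are placeholders), an on-demand run is a POST to a job-instances endpoint:

```python
# Sketch: trigger an on-demand pipeline run via the Fabric Core Jobs API.
# Workspace and pipeline item IDs are placeholders.

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    """Return the POST URL that starts an on-demand Pipeline job instance."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}/items/"
            f"{pipeline_id}/jobs/instances?jobType=Pipeline")

run_url = build_run_pipeline_url("<workspace-id>", "<pipeline-id>")
# POSTing to run_url with a bearer token starts the run; the response
# typically points at the new job instance, which can then be monitored.
```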
1. You can monitor the run and check the results on the Output tab below the pipeline canvas. Select the name of the pipeline to view the run details.

   :::image type="content" source="media/tutorial-end-to-end-pipeline/run-details-button.png" alt-text="Screenshot showing the run details button in the pipeline Output tab." lightbox="media/tutorial-end-to-end-pipeline/run-details-button.png":::
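The statuses shown on the Output tab mirror the job-instance states reported by the Fabric Jobs API, so a long-running copy can also be monitored programmatically. A minimal sketch, assuming the documented status values (the `fetch_instance` helper mentioned in the comment is hypothetical):

```python
# Sketch: decide when a Fabric pipeline run has finished, based on the
# job-instance status values reported by the Fabric Jobs API.

TERMINAL_STATES = {"Completed", "Failed", "Cancelled", "Deduped"}

def is_run_finished(job_instance: dict) -> bool:
    """True when a job-instance payload reports a terminal status."""
    return job_instance.get("status") in TERMINAL_STATES

# A polling loop would GET the job instance until finished, for example:
#   while not is_run_finished(fetch_instance()):  # fetch_instance is hypothetical
#       time.sleep(30)
```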
Once the copy completes, which can take around half an hour, continue to the next section to create your dataflow.
> [!div class="nextstepaction"]
> Module 2: Transform data with a dataflow in Data Factory