Commit 9a6f09e (1 parent: 5553b0f)

refresh azure stream analytics module

11 files changed: 21 additions & 11 deletions

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/1-introduction.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Introduction
 metadata:
   title: Introduction
   description: "Introduction"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/2-stream-ingestion-scenarios.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Stream ingestion scenarios
 metadata:
   title: Stream Ingestion Scenarios
   description: "Stream ingestion scenarios"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/3-configure-inputs-outputs.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Configure inputs and outputs
 metadata:
   title: Configure Inputs and Outputs
   description: "Configure inputs and outputs"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/4-define-query.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Define a query to select, filter, and aggregate data
 metadata:
   title: Define a Query to Select, Filter, and Aggregate Data
   description: "Define a query to select, filter, and aggregate data"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/5-run-job-ingest.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Run a job to ingest data
 metadata:
   title: Run a Job to Ingest Data
   description: "Run a job to ingest data"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/6-exercise-ingest-streaming-data.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Exercise - Ingest streaming data into Azure Synapse Analytics
 metadata:
   title: Exercise - Ingest Streaming Data Into Azure Synapse Analytics
   description: "Exercise - Ingest streaming data into Azure Synapse Analytics"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/7-knowledge-check.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Module assessment
 metadata:
   title: Module Assessment
   description: "Knowledge check"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/8-summary.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Summary
 metadata:
   title: Summary
   description: "Summary"
-  ms.date: 08/21/2025
+  ms.date: 04/15/2026
   author: weslbo
   ms.author: wedebols
   ms.topic: unit

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/includes/3-configure-inputs-outputs.md

Lines changed: 5 additions & 1 deletion
@@ -1,13 +1,17 @@
 
 All Azure Stream Analytics jobs include at least one input and output. In most cases, inputs reference sources of streaming data (though you can also define inputs for static reference data to augment the streamed event data). Outputs determine where the results of the stream processing query will be sent. In the case of data ingestion into Azure Synapse Analytics, the output usually references an Azure Data Lake Storage Gen2 container or a table in a dedicated SQL pool database.
 
+> [!NOTE]
+> Azure Stream Analytics offers two authoring experiences: the traditional SQL query editor covered in this module, and a no-code drag-and-drop editor. The no-code editor lets you build complete jobs — including inputs, transformations, and Synapse outputs — visually without writing SQL. You can access it from the **Overview** page of a Stream Analytics job in the Azure portal, or from Azure Event Hubs via **Process Data**. For more information, see [No-code stream processing in Azure Stream Analytics](/azure/stream-analytics/no-code-stream-processing).
+
 ## Streaming data inputs
 
 Inputs for streaming data consumed by Azure Stream Analytics can include:
 
 - Azure Event Hubs
 - Azure IoT Hubs
 - Azure Blob or Data Lake Gen 2 Storage
+- Apache Kafka
 
 Depending on the specific input type, the data for each streamed event includes the event's data fields as well as input-specific metadata fields. For example, data consumed from an Azure Event Hubs input includes an **EventEnqueuedUtcTime** field indicating the time when the event was received in the event hub.
 
@@ -18,7 +22,7 @@ Depending on the specific input type, the data for each streamed event includes
 
 If you need to load the results of your stream processing into a table in a dedicated SQL pool, use an **Azure Synapse Analytics** output. The output configuration includes the identity of the dedicated SQL pool in an Azure Synapse Analytics workspace, details of how the Azure Stream Analytics job should establish an authenticated connection to it, and the existing table into which the data should be loaded.
 
-Authentication to Azure Synapse Analytics is usually accomplished through SQL Server authentication, which requires a username and password. Alternatively, you can use a managed identity to authenticate. When using an Azure Synapse Analytics output, your Azure Stream Analytics job configuration must include an Azure Storage account in which authentication metadata for the job is stored securely.
+The recommended authentication method is **managed identity**, which eliminates password management overhead and avoids the 90-day token expiration that affects user-based authentication methods. Using managed identity also enables fully automated Stream Analytics deployments without embedded credentials. Alternatively, you can use SQL Server authentication with a username and password. When using an Azure Synapse Analytics output, your Azure Stream Analytics job configuration must include an Azure Storage account in which authentication metadata for the job is stored securely.
 
 > [!NOTE]
 > For more information about using an Azure Synapse Analytics output, see [Azure Synapse Analytics output from Azure Stream Analytics](/azure/stream-analytics/azure-synapse-analytics-output?azure-portal=true) in the Azure Stream Analytics documentation.

learn-pr/wwl-data-ai/ingest-streaming-data-use-azure-stream-analytics-synapse/includes/4-define-query.md

Lines changed: 7 additions & 1 deletion
@@ -37,7 +37,13 @@ WHERE ReadingValue < 0
 
 ## Aggregating events over temporal windows
 
-A common pattern for streaming queries is to aggregate event data over temporal (time-based) intervals, or *windows*. To accomplish this, you can use a **GROUP BY** clause that includes a Window function defining the kind of window you want to define (for example, *tumbling*, *hopping*, or *sliding*).
+A common pattern for streaming queries is to aggregate event data over temporal (time-based) intervals, or *windows*. To accomplish this, you can use a **GROUP BY** clause that includes a Window function defining the kind of window you want to define. Azure Stream Analytics supports five window types:
+
+- **Tumbling** — fixed-size, non-overlapping, contiguous intervals. An event belongs to exactly one tumbling window.
+- **Hopping** — overlapping scheduled windows that advance by a fixed hop interval. Events can belong to more than one window.
+- **Sliding** — fire only when the window content changes (when an event arrives or leaves). Every window contains at least one event.
+- **Session** — variable-length windows grouped by inactivity gap between events. Useful for IoT and user-activity scenarios where event bursts are separated by idle periods.
+- **Snapshot** — groups events that share the same timestamp using `System.Timestamp()`. Typically used after a preceding window function to further aggregate its output.
 
 > [!TIP]
 > For more information about window functions, see [Introduction to Stream Analytics windowing functions](/azure/stream-analytics/stream-analytics-window-functions?azure-portal=true) in the Azure Stream Analytics documentation.
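The tumbling behavior described in the window-type bullets added by this commit can be illustrated outside of Stream Analytics. The following Python sketch (not part of the commit; the function name and sample timestamps are hypothetical) groups timestamped events into fixed, non-overlapping one-minute buckets, analogous to what a `GROUP BY ... TumblingWindow(minute, 1)` clause does in a Stream Analytics query:

```python
from datetime import datetime, timedelta, timezone

def tumbling_windows(events, size):
    """Count (timestamp, value) events per fixed, non-overlapping
    window of length `size`, keyed by each window's start time."""
    epoch = datetime.min.replace(tzinfo=timezone.utc)
    counts = {}
    for ts, _value in events:
        # Align the event's timestamp down to its window boundary,
        # so every event lands in exactly one window.
        start = ts - ((ts - epoch) % size)
        counts[start] = counts.get(start, 0) + 1
    return counts

# Hypothetical sample events at 5s, 12s, 61s, and 125s past a base time.
base = datetime(2026, 4, 15, tzinfo=timezone.utc)
events = [(base + timedelta(seconds=s), 1.0) for s in (5, 12, 61, 125)]
windows = tumbling_windows(events, timedelta(minutes=1))
# The 5s and 12s events share the first window; 61s and 125s each
# fall into a later one-minute window, giving three windows in total.
```

A hopping window would differ only in that consecutive windows overlap, so one event could be counted in several buckets.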
