---
title: Azure Stream Analytics solution patterns
description: Learn about common solution patterns for Azure Stream Analytics, including dashboarding, event messaging, data stores, Delta Lake, Microsoft Fabric, and monitoring.
author: ahartoon
ms.author: anboisve
ms.service: azure-stream-analytics
ms.topic: concept-article
ms.date: 03/24/2026
---
# Azure Stream Analytics solution patterns
## Add Machine Learning to your real-time insights
Azure Stream Analytics' built-in [Anomaly Detection model](stream-analytics-machine-learning-anomaly-detection.md) is a convenient way to introduce Machine Learning to your real-time application. For a wider range of Machine Learning needs, see [Azure Stream Analytics integration with Azure Machine Learning](stream-analytics-machine-learning-integration-tutorial.md). You can deploy models from Azure Machine Learning and call them as user-defined functions (UDFs) in your Stream Analytics queries.
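
As a sketch, the built-in model can be invoked directly from the query language. The following query is adapted from the documented spike-and-dip example; the `[sensor-input]` and `[alerts-output]` names and the `temperature` column are placeholders:

```sql
-- Score a temperature stream with the built-in spike-and-dip model over a
-- two-minute sliding window (input/output names are placeholders).
WITH AnomalyDetectionStep AS
(
    SELECT
        EVENTENQUEUEDUTCTIME AS time,
        CAST(temperature AS float) AS temp,
        AnomalyDetection_SpikeAndDip(CAST(temperature AS float), 95, 120, 'spikesanddips')
            OVER (LIMIT DURATION(second, 120)) AS SpikeAndDipScores
    FROM [sensor-input]
)
SELECT
    time,
    temp,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'Score') AS float) AS SpikeAndDipScore,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) AS IsSpikeAndDipAnomaly
INTO [alerts-output]
FROM AnomalyDetectionStep
```

The same pattern applies to models deployed as UDFs from Azure Machine Learning: the function call replaces the built-in operator in the scoring step.
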
For advanced users who want to incorporate online training and scoring into the same Stream Analytics pipeline, see this example of how to do that with [linear regression](stream-analytics-high-frequency-trading.md).
:::image type="content" source="media/stream-analytics-solution-patterns/machine-learning-app.png" alt-text="Diagram that shows an Azure Stream Analytics job using an ML scoring model.":::
## Archiving real-time data for analytics
Most data science and analytics activities still happen offline. You can archive data in Azure Stream Analytics through Azure Data Lake Store Gen2 output and Parquet output formats. This capability removes the friction to feed data directly into Azure Synapse Analytics, Azure Databricks, Microsoft Fabric, and Azure HDInsight. Azure Stream Analytics is used as a near real-time Extract-Transform-Load (ETL) engine in this solution. You can explore archived data in Data Lake using various compute engines.
:::image type="content" source="media/stream-analytics-solution-patterns/offline-analytics.png" alt-text="Diagram that shows archiving of real-time data from a Stream Analytics job.":::
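
A minimal sketch of this near real-time ETL pattern is a pass-through query that lands raw events in the lake. The `[iot-input]` and `[archive-output]` names and the event columns are hypothetical; `[archive-output]` is assumed to be an ADLS Gen2 output configured for Parquet serialization:

```sql
-- Archive raw telemetry to the data lake in Parquet format.
-- The output's path pattern (for example, a date/time-based pattern)
-- controls the folder layout in the lake.
SELECT
    deviceId,
    eventTime,
    temperature,
    humidity
INTO [archive-output]
FROM [iot-input] TIMESTAMP BY eventTime
```
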
:::image type="content" source="media/stream-analytics-solution-patterns/insights-operationalization.png" alt-text="Diagram that shows both cold path and hot path in a Stream Analytics solution.":::
## Apache Kafka integration
Stream Analytics supports [Apache Kafka](../event-hubs/azure-event-hubs-kafka-overview.md) as both an input and an output through the Azure Event Hubs Kafka endpoint. This pattern enables:

- Migration from existing Kafka-based architectures to Azure
- Hybrid scenarios connecting on-premises Kafka clusters to Azure
- Integration with Apache Kafka ecosystem tools and connectors
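
Because the Kafka protocol is handled at the Event Hubs layer, the job's query is unchanged. A minimal sketch, assuming a hypothetical `[kafka-input]` Event Hubs input fed by Kafka producers and a hypothetical `eventType` property:

```sql
-- The query is transport-agnostic: events produced over the Kafka protocol
-- arrive through the Event Hubs input like any other stream.
SELECT
    *
INTO [processed-output]
FROM [kafka-input]
WHERE eventType = 'telemetry'
```
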
## Delta Lake output for lakehouse architectures
For modern lakehouse architectures, Stream Analytics can write directly to [Delta Lake format](write-to-delta-lake.md) in Azure Data Lake Storage Gen2. Delta Lake provides:

- ACID transactions for reliable data ingestion
- Schema enforcement and evolution
- Time travel capabilities for data versioning
- Unified batch and streaming data access
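
A minimal sketch of streaming ingestion into a Delta table follows, assuming a hypothetical `[delta-output]` ADLS Gen2 output configured with the Delta Lake format and a delta table path, and a hypothetical `[iot-input]` stream:

```sql
-- Continuously land five-minute aggregates into a Delta table.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO [delta-output]
FROM [iot-input] TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(minute, 5)
```
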
### Choosing the right pattern
Use this table to help select the appropriate pattern for your scenario:

| Scenario | Recommended pattern | Key benefit |
|----------|-------------------|-------------|
| Real-time dashboards | Power BI streaming dataset | Lowest latency |
| Complex reporting | SQL Database + Power BI | Full BI capabilities |
An Azure Stream Analytics job can be run 24/7 to process incoming events continuously in real time. Its uptime guarantee is crucial to the health of the overall application. While Stream Analytics is the only streaming analytics service in the industry that offers a [99.9% availability guarantee](https://azure.microsoft.com/support/legal/sla/stream-analytics/v1_0/), you still incur some level of down time. Over the years, Stream Analytics has introduced metrics, logs, and job states to reflect the health of the jobs. All of them are surfaced through the Azure Monitor service and can be exported to a Log Analytics workspace for deeper analysis. For more information, see [Monitor Stream Analytics job with Azure portal](./stream-analytics-monitoring.md).
:::image type="content" source="media/stream-analytics-solution-patterns/monitoring.png" alt-text="Diagram that shows monitoring of Stream Analytics jobs.":::
This metric reflects how far behind your processing pipeline is in wall clock time (seconds). Some of the delay is attributed to the inherent processing logic. As a result, monitoring the increasing trend is much more important than monitoring the absolute value. The steady state delay should be addressed by your application design, not by monitoring or alerts.
### Set up alerts and dashboards
Configure Azure Monitor alerts for proactive monitoring:

1. **SU utilization** - Alert when sustained above 80% to prevent job failures
2. **Watermark delay** - Alert on increasing trends that indicate processing lag
3. **Input/Output events** - Monitor for sudden drops indicating connectivity issues
4. **Runtime errors** - Track deserialization and data conversion failures
For centralized observability, export Stream Analytics metrics and logs to a Log Analytics workspace. This enables:

- Cross-job correlation and analysis
- Custom Kusto queries for deep diagnostics
- Integration with Azure dashboards and workbooks
Upon failure, activity logs and [diagnostics logs](stream-analytics-job-diagnostic-logs.md) are the best places to begin looking for errors.
## Build resilient and mission critical applications
## Next steps
You've learned about various solution patterns using Azure Stream Analytics. Next, you can dive deep and create your first Stream Analytics job:
* [Create a Stream Analytics job by using the Azure portal](stream-analytics-quick-create-portal.md)
* [Build a no-code stream processing pipeline](no-code-stream-processing.md)
* [Output to Delta Lake format](write-to-delta-lake.md)
* [Use managed identities to secure your job](stream-analytics-managed-identities-overview.md)