articles/iot-operations/connect-to-cloud/concept-dataflow-graphs.md (7 additions, 7 deletions)
@@ -41,15 +41,15 @@ For new projects that use supported endpoint types, we recommend data flow graph
Each transform is a pre-built processing step that you configure with rules and chain with other transforms inside a `DataflowGraph` resource.
-<!--| Transform | What it does | Learn more |
+| Transform | What it does | Learn more |
|-----------|-------------|------------|
|**Map**| Rename, restructure, compute, and copy fields |[Transform data with map](howto-dataflow-graphs-map.md)|
|**Filter**| Drop messages that match a condition |[Filter and route data](howto-dataflow-graphs-filter-route.md)|
|**Branch**| Route each message to a `true` or `false` path based on a condition |[Filter and route data](howto-dataflow-graphs-filter-route.md#branch-transform)|
|**Concat**| Merge two or more paths back into one |[Filter and route data](howto-dataflow-graphs-filter-route.md#merge-paths-with-concat)|
-| **Window** | Collect messages over a time interval, then aggregate | [Aggregate data over time](howto-dataflow-graphs-window.md) |-->
+|**Window**| Collect messages over a time interval, then aggregate |[Aggregate data over time](howto-dataflow-graphs-window.md)|
-<!--All transforms share an [expression language](concept-dataflow-graphs-expressions.md) for operators, functions, and field references. You can also [enrich](howto-dataflow-graphs-enrich.md) messages with external data from a state store in map, filter, and branch transforms.-->
+All transforms share an [expression language](concept-dataflow-graphs-expressions.md) for operators, functions, and field references. You can also [enrich](howto-dataflow-graphs-enrich.md) messages with external data from a state store in map, filter, and branch transforms.
## How transforms compose
@@ -69,7 +69,7 @@ Here's a complete example that reads temperature data from an MQTT topic, conver
# [Operations experience](#tab/portal)
-<!-- -->
+
In the Operations experience:
@@ -182,7 +182,7 @@ spec:
The pipeline defines three elements: a source, a transform (indicated by `nodeType: Graph`), and a destination. The connections describe how data flows between them. The transform's `configuration` passes rules as a JSON string under the `rules` key.
-<!-- In the how-to articles that follow, examples focus on the transform rules themselves. For a step-by-step guide to creating a data flow graph, see [Create a data flow graph](howto-create-dataflow-graph.md). -->
+In the how-to articles that follow, examples focus on the transform rules themselves. For a step-by-step guide to creating a data flow graph, see [Create a data flow graph](howto-create-dataflow-graph.md).
## Built-in transforms vs. WASM transforms
@@ -223,7 +223,7 @@ To use data flow graphs, you need:
## Next steps
-<!-- - [Data flows vs. data flow graphs](overview-dataflow-comparison.md)
+- [Data flows vs. data flow graphs](overview-dataflow-comparison.md)
- [Create a data flow graph](howto-create-dataflow-graph.md)
- [Transform data with map](howto-dataflow-graphs-map.md)
- [Filter and route data](howto-dataflow-graphs-filter-route.md)
@@ -232,4 +232,4 @@ To use data flow graphs, you need:
articles/iot-operations/connect-to-cloud/howto-configure-dataflow-destination.md (6 additions, 6 deletions)
@@ -230,7 +230,7 @@ In data flow graphs, you can set the destination topic dynamically based on mess
This approach is more flexible than source topic routing because it lets you set the destination topic based on any field or computed value in the message, not just the source topic structure.
-<!-- For more information and complete examples, see [Route messages to different topics](howto-dataflow-graphs-topic-routing.md). -->
+For more information and complete examples, see [Route messages to different topics](howto-dataflow-graphs-topic-routing.md).
## Serialize the output with a schema
@@ -247,7 +247,7 @@ Specify the schema and serialization format in the data flow endpoint details. T
# [Azure CLI](#tab/cli)
-<!-- After you [upload a schema to the schema registry](concept-schema-registry.md#upload-a-schema), reference it in the data flow configuration. -->
+After you [upload a schema to the schema registry](concept-schema-registry.md#upload-a-schema), reference it in the data flow configuration.
```json
{
@@ -260,7 +260,7 @@ Specify the schema and serialization format in the data flow endpoint details. T
# [Bicep](#tab/bicep)
-<!-- After you [upload a schema to the schema registry](concept-schema-registry.md#upload-a-schema), reference it in the data flow configuration. -->
+After you [upload a schema to the schema registry](concept-schema-registry.md#upload-a-schema), reference it in the data flow configuration.
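A minimal sketch of what such a schema reference might look like in the destination configuration. The property names and the schema-reference format are assumptions for illustration; the angle-bracket placeholders stand in for your own registry namespace, schema name, and version:

```json
{
  "serializationFormat": "Parquet",
  "schemaRef": "<schema-namespace>/<schema-name>:<version>"
}
```

The reference ties the destination's serialized output to a specific uploaded schema version, so a later schema change means uploading a new version and updating the reference rather than editing the data flow in place.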
articles/iot-operations/connect-to-cloud/howto-configure-dataflow-source.md (3 additions, 3 deletions)
@@ -18,7 +18,7 @@ ai-usage: ai-assisted
The source is where data enters a data flow or data flow graph. You configure the source by specifying an endpoint reference and a list of data sources (topics) for that endpoint.
-<!--This page applies to both [data flows](overview-dataflow.md) and [data flow graphs](concept-dataflow-graphs.md). For data flows, the source is an operation in the `Dataflow` resource. For data flow graphs, the source is a `Source` node in the `DataflowGraph` resource.-->
+This page applies to both [data flows](overview-dataflow.md) and [data flow graphs](concept-dataflow-graphs.md). For data flows, the source is an operation in the `Dataflow` resource. For data flow graphs, the source is a `Source` node in the `DataflowGraph` resource.
22
22
23
23
> [!IMPORTANT]
24
24
> Data flows support MQTT and Kafka source endpoints. Data flow graphs support MQTT, Kafka, and OpenTelemetry source endpoints. Each data flow must have the Azure IoT Operations local MQTT broker default endpoint as either the source or destination. For more information, see [Data flows must use local MQTT broker endpoint](./howto-configure-dataflow-endpoint.md#data-flows-must-use-local-mqtt-broker-endpoint).
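The description above (an endpoint reference plus a list of data source topics) can be sketched as a source operation in a `Dataflow` resource. Treat the field names as illustrative assumptions based on that description, not an exact schema:

```yaml
# Illustrative sketch of a data flow source operation: one endpoint
# reference and a list of data sources (MQTT topics, wildcards allowed).
# Field names are assumptions drawn from the prose above.
- operationType: Source
  sourceSettings:
    endpointRef: default        # the local MQTT broker default endpoint
    dataSources:
      - thermostats/+/telemetry/temperature
      - humidifiers/+/telemetry/humidity
```

Listing several topics under one source lets a single data flow subscribe to related telemetry streams through the same endpoint instead of defining one data flow per topic.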
@@ -436,6 +436,6 @@ For more information, see [Understand message schemas](concept-schema-registry.m
## Next steps
-<!-- - [Configure a data flow destination](howto-configure-dataflow-destination.md)
+- [Configure a data flow destination](howto-configure-dataflow-destination.md)