Commit a86cae1

docs: overhaul data flow source doc headings, update UI terminology, add inputTopic cross-reference

- Promote H3 headings to H2 for better right-side navigation
- Rename headings to be more direct (e.g., 'Subscribe to multiple topics' instead of 'Configure data sources (MQTT or Kafka topics)')
- Update UI references from 'Message broker' to 'Data flow endpoint' to match current portal
- Add 'Use the source topic in the destination path' section clarifying that ${inputTopic} works for both data flows and data flow graphs
- Update internal anchor links

1 parent 6058a99 commit a86cae1

1 file changed (26 additions & 20 deletions)

articles/iot-operations/connect-to-cloud/howto-configure-dataflow-source.md
@@ -6,7 +6,7 @@ ms.author: sethm
 ms.service: azure-iot-operations
 ms.subservice: azure-data-flows
 ms.topic: how-to
-ms.date: 03/25/2026
+ms.date: 03/26/2026
 ai-usage: ai-assisted

 #CustomerIntent: As an operator, I want to configure the source for a data flow or data flow graph.
@@ -19,22 +19,20 @@ ai-usage: ai-assisted
 The source is where data enters a data flow or data flow graph. You configure the source by specifying an endpoint reference and a list of data sources (topics) for that endpoint.

 > [!TIP]
-> A single data flow source can subscribe to **multiple MQTT or Kafka topics** at once. You don't need to create separate data flows for each topic. Use the `dataSources` field (or **Topic(s)** > **Add row** in the operations experience) to add multiple topic filters, including wildcards. For more information, see [Configure MQTT or Kafka topics](#configure-data-sources-mqtt-or-kafka-topics).
+> A single data flow source can subscribe to **multiple MQTT or Kafka topics** at once. You don't need to create separate data flows for each topic. Use the `dataSources` field (or **Topic(s)** > **Add row** in the operations experience) to add multiple topic filters, including wildcards. For more information, see [Subscribe to multiple topics](#subscribe-to-multiple-topics).

 This page applies to both [data flows](overview-dataflow.md) and [data flow graphs](concept-dataflow-graphs.md). For data flows, the source is an operation in the `Dataflow` resource. For data flow graphs, the source is a `Source` node in the `DataflowGraph` resource.

 > [!IMPORTANT]
 > Data flows support MQTT and Kafka source endpoints. Data flow graphs support MQTT, Kafka, and OpenTelemetry source endpoints. Each data flow must have the Azure IoT Operations local MQTT broker default endpoint as either the source or destination. For more information, see [Data flows must use local MQTT broker endpoint](./howto-configure-dataflow-endpoint.md#data-flows-must-use-local-mqtt-broker-endpoint).

-## Choose a source endpoint
-
 You can use one of the following options as the source.

-### Option 1: Use the default message broker endpoint
+## Use the default endpoint

 # [Operations experience](#tab/portal)

-1. Under **Source details**, select **Message broker**.
+1. Under **Source details**, select **Data flow endpoint**.

 :::image type="content" source="media/howto-create-dataflow/dataflow-source-mqtt.png" alt-text="Screenshot of the operations experience interface showing the selection of the message broker as the source endpoint for a data flow.":::

@@ -43,7 +41,7 @@ You can use one of the following options as the source.
 | Setting | Description |
 | -------------------- | ------------------------------------------------------------------------------------------------- |
 | Data flow endpoint | Select *default* to use the default MQTT message broker endpoint. |
-| Topic | The topic filter to subscribe to for incoming messages. Use **Topic(s)** > **Add row** to add multiple topics. For more information on topics, see [Configure MQTT or Kafka topics](#configure-data-sources-mqtt-or-kafka-topics). |
+| Topic | The topic filter to subscribe to for incoming messages. Use **Topic(s)** > **Add row** to add multiple topics. For more information on topics, see [Subscribe to multiple topics](#subscribe-to-multiple-topics). |
 | Message schema | The schema to use to deserialize the incoming messages. See [Specify schema to deserialize data](#specify-source-schema). |

 1. Select **Apply**.
@@ -87,9 +85,9 @@ sourceSettings:

 ---

-Because `dataSources` accepts MQTT or Kafka topics without modifying the endpoint configuration, you can reuse the endpoint for multiple data flows even if the topics are different. For more information, see [Configure data sources](#configure-data-sources-mqtt-or-kafka-topics).
+Because `dataSources` accepts MQTT or Kafka topics without modifying the endpoint configuration, you can reuse the endpoint for multiple data flows even if the topics are different. For more information, see [Subscribe to multiple topics](#subscribe-to-multiple-topics).

-### Option 2: Use an asset as a source
+## Use an asset as a source

 # [Operations experience](#tab/portal)

@@ -123,13 +121,13 @@ When you use an asset as the source, the asset definition provides the schema fo

 After you configure the source, the data from the asset reaches the data flow through the local MQTT broker. So, when you use an asset as the source, the data flow uses the local MQTT broker default endpoint as the source.

-### Option 3: Use a custom MQTT or Kafka endpoint
+## Use a custom MQTT or Kafka endpoint

 If you created a custom MQTT or Kafka data flow endpoint (for example, to use with Event Grid or Event Hubs), you can use it as the source for the data flow. Remember that storage type endpoints, like Data Lake or Fabric OneLake, can't be used as a source.

 # [Operations experience](#tab/portal)

-1. Under **Source details**, select **Message broker**.
+1. Under **Source details**, select **Data flow endpoint**.

 :::image type="content" source="media/howto-create-dataflow/dataflow-source-custom.png" alt-text="Screenshot using operations experience to select a custom message broker as the source endpoint.":::

@@ -138,7 +136,7 @@ If you created a custom MQTT or Kafka data flow endpoint (for example, to use wi
 | Setting | Description |
 | -------------------- | ------------------------------------------------------------------------------------------------- |
 | Data flow endpoint | Use the **Reselect** button to select a custom MQTT or Kafka data flow endpoint. For more information, see [Configure MQTT data flow endpoints](howto-configure-mqtt-endpoint.md) or [Configure Azure Event Hubs and Kafka data flow endpoints](howto-configure-kafka-endpoint.md).|
-| Topic | The topic filter to subscribe to for incoming messages. Use **Topic(s)** > **Add row** to add multiple topics. For more information on topics, see [Configure MQTT or Kafka topics](#configure-data-sources-mqtt-or-kafka-topics). |
+| Topic | The topic filter to subscribe to for incoming messages. Use **Topic(s)** > **Add row** to add multiple topics. For more information on topics, see [Subscribe to multiple topics](#subscribe-to-multiple-topics). |
 | Message schema | The schema to use to deserialize the incoming messages. See [Specify schema to deserialize data](#specify-source-schema). |

 1. Select **Apply**.
@@ -188,17 +186,17 @@ sourceSettings:

 ---

-## Configure data sources (MQTT or Kafka topics)
+## Subscribe to multiple topics

 You can specify multiple MQTT or Kafka topics in a source without needing to modify the data flow endpoint configuration. This flexibility means you can reuse the same endpoint across multiple data flows, even if the topics vary. For more information, see [Reuse data flow endpoints](./howto-configure-dataflow-endpoint.md#reuse-endpoints).

-### MQTT topics
+## MQTT topic wildcards

 When the source is an MQTT (Event Grid included) endpoint, use the MQTT topic filter to subscribe to incoming messages. The topic filter can include wildcards to subscribe to multiple topics. For example, `thermostats/+/sensor/temperature/#` subscribes to all temperature sensor messages from thermostats. To configure the MQTT topic filters:

 # [Operations experience](#tab/portal)

-In the operations experience data flow **Source details**, select **Message broker**, then use the **Topic(s)** field to specify the MQTT topic filters to subscribe to for incoming messages. To add multiple MQTT topics, select **Add row** and enter a new topic.
+In the operations experience data flow **Source details**, select **Data flow endpoint**, then use the **Topic(s)** field to specify the MQTT topic filters to subscribe to for incoming messages. To add multiple MQTT topics, select **Add row** and enter a new topic.

 :::image type="content" source="media/howto-configure-dataflow-source/dataflow-source-multiple-topics.png" alt-text="Screenshot of the operations experience interface showing multiple MQTT topic filters configured in the source details for a data flow.":::

@@ -284,13 +282,13 @@ Here, the wildcard `+` selects all devices under the `thermostats` and `humidifi

 ---

-### Shared subscriptions
+## Shared subscriptions

 To use shared subscriptions with message broker sources, specify the shared subscription topic in the form of `$shared/<GROUP_NAME>/<TOPIC_FILTER>`.

 # [Operations experience](#tab/portal)

-In operations experience data flow **Source details**, select **Message broker** and use the **Topic** field to specify the shared subscription group and topic.
+In operations experience data flow **Source details**, select **Data flow endpoint** and use the **Topic** field to specify the shared subscription group and topic.

 # [Azure CLI](#tab/cli)
 ```json
@@ -331,7 +329,7 @@ You can explicitly create a topic named `$shared/mygroup/topic` in your configur
 > [!IMPORTANT]
 > Shared subscriptions are important for data flows when the instance count is greater than one and you're using Event Grid MQTT broker as a source, since it [doesn't support shared subscriptions](../../event-grid/mqtt-support.md#mqtt-v5-current-limitations). To avoid missing messages, set the data flow profile instance count to one when using Event Grid MQTT broker as the source. That is when the data flow is the subscriber and receiving messages from the cloud.

-### Kafka topics
+## Kafka topics

 When the source is a Kafka (Event Hubs included) endpoint, specify the individual Kafka topics to subscribe to for incoming messages. Wildcards aren't supported, so you must specify each topic statically.

@@ -342,7 +340,7 @@ To configure the Kafka topics:

 # [Operations experience](#tab/portal)

-In the operations experience data flow **Source details**, select **Message broker**, then use the **Topic** field to specify the Kafka topic filter to subscribe to for incoming messages.
+In the operations experience data flow **Source details**, select **Data flow endpoint**, then use the **Topic** field to specify the Kafka topic filter to subscribe to for incoming messages.

 > [!NOTE]
 > You can specify only one topic filter in the operations experience. To use multiple topic filters, use Bicep or Kubernetes.
@@ -386,6 +384,14 @@ sourceSettings:

 ---

+## Use the source topic in the destination path
+
+When you subscribe to multiple topics with wildcards, you can use the source topic as a variable in the destination path. This feature works with both data flows and data flow graphs.
+
+Use `${inputTopic}` for the full source topic, or `${inputTopic.N}` to extract a specific segment (1-indexed). For example, if you subscribe to `factory/+/telemetry/#`, a message arriving on `factory/line1/telemetry/temp` can be routed to a destination topic like `processed/${inputTopic.2}/data`, which resolves to `processed/line1/data`.
+
+For full details and examples, see [Dynamic destination topics](howto-configure-dataflow-destination.md#dynamic-destination-topics).
+
 ## Specify source schema

 When you use MQTT or Kafka as the source, you can specify a [schema](concept-schema-registry.md) to display the list of data points in the operations experience web UI. Using a schema to deserialize and validate incoming messages [isn't currently supported](../troubleshoot/known-issues.md#data-flows-issues).
@@ -399,7 +405,7 @@ To configure the schema used to deserialize the incoming messages from a source:

 # [Operations experience](#tab/portal)

-In the Operations experience data flow **Source details**, select **Message broker** and use the **Message schema** field to specify the schema. Select **Upload** to upload a schema file. For more information, see [Understand message schemas](concept-schema-registry.md).
+In the Operations experience data flow **Source details**, select **Data flow endpoint** and use the **Message schema** field to specify the schema. Select **Upload** to upload a schema file. For more information, see [Understand message schemas](concept-schema-registry.md).

 # [Azure CLI](#tab/cli)
