articles/iot-operations/discover-manage-assets/howto-connect-kafka.md
---
title: Connect to a Kafka source
description: Use a data flow and the connector for MQTT to ingest data from a Kafka-compatible source such as Azure Event Hubs, discover topics as assets, and route data through the MQTT broker in Azure IoT Operations.
author: dominicbetts
ms.author: dobett
ms.service: azure-iot-operations
ms.topic: how-to
ms.date: 03/06/2026

#CustomerIntent: As an industrial edge IT or operations user, I want to ingest data from an Azure Event Hubs namespace into Azure IoT Operations using the Kafka protocol so that I can manage event hubs as assets and route the data through the MQTT broker for processing.
---

# Connect to a Kafka source

Many messaging services expose a Kafka-compatible endpoint, including Azure Event Hubs, Confluent Cloud, and self-hosted Apache Kafka clusters. Azure IoT Operations doesn't include a dedicated southbound Kafka connector, but you can ingest data from any Kafka-compatible source by using a data flow and the connector for MQTT. For simplicity, this article uses an Azure Event Hubs namespace as the Kafka source.

To connect to a Kafka source, you combine:

- A **data flow** with a Kafka inbound endpoint that connects to the Event Hubs namespace and routes messages to the internal MQTT broker. You can configure the data flow to use the Event Hubs topic name as the destination topic in the MQTT broker. This mapping lets you route messages from multiple event hubs to corresponding MQTT broker topics without a separate configuration for each one.

- The **connector for MQTT** configured with a device that uses discovery to find the MQTT broker topics that correspond to your event hubs. The connector creates assets for each discovered topic. You can then configure each asset to route data to your own named topics in the MQTT broker, and apply any individual data processing you need as the data flows out of the system.

The following diagram illustrates this architecture:

:::image type="content" source="media/howto-connect-kafka/kafka.svg" alt-text="Diagram that shows the architecture of the solution." lightbox="media/howto-connect-kafka/kafka.png":::

Messages enter from the Kafka source, such as an Azure Event Hubs namespace, and are ingested into Azure IoT Operations through a data flow with a Kafka source endpoint. The data flow routes messages to topics in the internal MQTT broker. The connector for MQTT detects the topics in the MQTT broker and lets you create assets based on the topic names. Each asset can be configured to route data to specific topics in the MQTT broker. You can then perform any routing and custom processing to the messages.
28
28
29
29
30
30
> [!TIP]
@@ -92,9 +92,9 @@ Create a data flow that reads from your Event Hubs namespace using the Kafka pro
92
92
93
93
Once the data flow is running, messages from your Event Hubs topics appear in the MQTT broker under the `factory/` topic prefix.
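
As a quick illustration of this mapping, the following Python sketch derives the MQTT broker topic for a given event hub. The `factory/` prefix matches the example data flow in this article; the helper function itself is illustrative and isn't part of Azure IoT Operations:

```python
def mqtt_topic_for_event_hub(event_hub_name: str, prefix: str = "factory") -> str:
    """Illustrative helper: build the MQTT broker topic that the example
    data flow publishes to for a given event hub, using the article's
    `factory/` topic prefix."""
    return f"{prefix}/{event_hub_name}"

# The event hub used later in this article lands on this broker topic:
print(mqtt_topic_for_event_hub("warehouse-w1-machine-m1"))
# factory/warehouse-w1-machine-m1
```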

## Step 3: Configure the connector for MQTT device

Set up a device in the connector for MQTT that subscribes to the MQTT broker topics where the Kafka data lands. Configure the device to use topic discovery so that each Kafka topic appears as an asset.

1. In the operations experience web UI, select **Devices** in the left navigation pane. Then select **Create new**.

## Step 4: Send test messages using Data Explorer

Before you can discover assets, messages must arrive at the MQTT broker so the connector for MQTT can detect the topics. You can use the **Data Explorer** feature built into each event hub in the Azure portal to send test messages.
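
Data Explorer sends whatever payload you give it. As a hypothetical example, you could generate a small JSON telemetry message like the following; the field names are illustrative, and Azure IoT Operations doesn't require a particular schema:

```python
import json

# Illustrative telemetry payload; paste the printed JSON into Data Explorer.
# The field names are example values, not a required schema.
payload = {
    "machineId": "m1",
    "temperature": 21.5,
    "timestamp": "2026-03-06T12:00:00Z",
}
print(json.dumps(payload))
```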

1. In the [Azure portal](https://portal.azure.com), navigate to your Event Hubs namespace, then select the event hub you want to test, such as `warehouse-w1-machine-m1`.

## Step 5: Discover and create assets

When the data flow forwards Event Hubs messages to the MQTT broker, the connector for MQTT detects the topic paths and creates a _discovered asset_ for each one. You can then import each discovered asset to manage it individually.
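
The connector derives each asset name from the MQTT topic path. The exact naming rules aren't covered here, but a hypothetical sketch of that derivation for this article's example topics looks like this:

```python
def asset_name_from_topic(topic_path: str, prefix: str = "factory/") -> str:
    """Hypothetical illustration of deriving an asset name from an MQTT
    topic path: drop the data flow's topic prefix and flatten any remaining
    path separators. The connector's real naming rules may differ."""
    name = topic_path.removeprefix(prefix)
    return name.replace("/", "-")

print(asset_name_from_topic("factory/warehouse-w1-machine-m1"))
# warehouse-w1-machine-m1
```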

1. In the operations experience web UI, select **Discovery** in the left navigation pane. Discovered assets from the Event Hubs topics appear in the list with the asset name derived from the topic path:

- [Configure Azure Event Hubs and Kafka data flow endpoints](../connect-to-cloud/howto-configure-kafka-endpoint.md)
- [Use Azure Event Hubs from Apache Kafka applications](/azure/event-hubs/event-hubs-for-kafka-ecosystem-overview)
- [Configure the connector for MQTT](howto-use-mqtt-connector.md)

To transform the incoming data by using a WASM module and graph, complete the following steps:

1. Develop a WASM module to perform the custom transformation. For more information, see [Develop WebAssembly (WASM) modules and graph definitions](../develop-edge-apps/howto-develop-wasm-modules.md) or [Build WASM modules for data flows in VS Code](../develop-edge-apps/howto-build-wasm-modules-vscode.md).

1. Configure your transformation graph. For more information, see [Configure WebAssembly (WASM) graph definitions](../develop-edge-apps/howto-configure-wasm-graph-definitions.md).

1. Deploy both the module and graph to your container registry. For more information, see [Deploy WebAssembly (WASM) modules and graph definitions](../develop-edge-apps/howto-deploy-wasm-graph-definitions.md).

1. Set up authentication and connection details so Azure IoT Operations can access the container registry.

1. Configure your asset's dataset with the URL of the deployed WASM graph in the **Transform** field:

    :::image type="content" source="media/howto-use-http-connector/configure-transform.png" alt-text="Screenshot that shows how to add a WASM transform to a dataset." lightbox="media/howto-use-http-connector/configure-transform.png":::

A data transformation in the HTTP/REST connector only requires a [single map operator](../develop-edge-apps/howto-develop-wasm-modules.md#quickstart-build-deploy-and-verify-a-wasm-module), but WASM graphs are fully supported with the following restrictions:

- The graph must have a single `source` node and a single `sink` node.
- The graph must consume and emit the `DataModel::Message` datatype.
- The graph must be stateless. Currently, this restriction means that accumulate operators aren't supported.
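
Real transformations are compiled to WASM with the tooling linked in the steps above. Purely to illustrate the contract these restrictions describe (stateless, one message in, one message out), here's the same idea expressed in Python with a hypothetical payload shape:

```python
import json

def map_message(message: bytes) -> bytes:
    """Conceptual stand-in for a stateless map operator: it transforms each
    message independently and keeps no state between calls, so it can't
    accumulate values across messages."""
    data = json.loads(message)
    # Example transformation: add a Fahrenheit reading alongside Celsius.
    if "temperature" in data:
        data["temperatureF"] = data["temperature"] * 9 / 5 + 32
    return json.dumps(data).encode()

print(map_message(b'{"temperature": 20}'))
```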