articles/partner-solutions/apache-kafka-confluent-cloud/add-confluent-connectors.md
---
title: Create a Confluent Connector for Azure Blob Storage (Preview)
description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
ms.topic: how-to
ms.date: 10/30/2025
ms.author: malev
author: maud-lv

#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
---

# Create a Confluent Connector to Azure Blob Storage (preview)

Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.

In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.

## Prerequisites
## Create a Confluent sink connector for Azure Blob Storage (preview)

To create a sink connector for Azure Blob Storage:

:::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::

1. Select **Create new connector**.
1. In the **Create a new connector** pane, configure the settings that are described in the next sections.

### Basics
On the **Basics** tab, enter or select values for the following settings:

:::image type="content" source="./media/confluent-connectors/basic-sink.png" alt-text="Screenshot that shows the Basics tab and creating a sink connector in the Azure portal.":::

Select **Next**.
### Authentication

On the **Authentication** tab, select an authentication method: **User** or **Service account**.

- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.

:::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
### Configuration

On the **Configuration** tab, enter or select the following values, and then select **Next**.

| Setting | Action |
| --- | --- |
| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Time Interval** | Select the time interval in which to group the data. Choose between hourly and daily. |
| **Flush size** | (Optional) Enter a flush size. The default flush size is 1,000. |
| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |

:::image type="content" source="./media/confluent-connectors/configuration-sink.png" alt-text="Screenshot that shows the Configuration tab for a sink connector in the Azure portal.":::

Select **Review + create** to continue.

### Review + create

Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.

In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
## Create a Confluent source connector for Azure Blob Storage (preview)

1. In the Azure portal, go to your Confluent organization.
1. In the left pane, select **Confluent** > **Confluent Connectors (Preview)**.

:::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::

1. In the **Create a new connector** pane, select **Create new connector**.

### Basics
On the **Basics** tab, enter or select values for the following settings:

:::image type="content" source="./media/confluent-connectors/basic-source.png" alt-text="Screenshot that shows the Basics tab and creating a source connector in the Azure portal.":::

Select **Next**.

### Authentication
On the **Authentication** tab, select an authentication method: **User** or **Service account**.

- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.

:::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
### Configuration

On the **Configuration** tab, enter or select values for the following settings:

| Name | Action |
| --- | --- |
| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Topic name and regex** | Configure the topic name and the regex pattern of your messages to ensure they're mapped. For example, `*my-topic:.*\.json+` moves all the files that have the `.json` extension into `my-topic`. |
| **Flush size** | (Optional) Enter a flush size. The default flush size is 1,000. |
| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |

:::image type="content" source="./media/confluent-connectors/configuration-source.png" alt-text="Screenshot that shows the Configuration tab and creating a source connector in the Azure portal.":::
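As a quick local sanity check of the regex example above, you can try the pattern portion against candidate file names with Python's `re` module. This is an illustrative sketch only: the file names are invented, and the exact matching semantics inside Confluent Cloud may differ.

```python
import re

# Pattern portion of the example mapping `*my-topic:.*\.json+` from the table.
pattern = re.compile(r".*\.json+")

# Files whose full name matches the pattern would be routed to `my-topic`.
for name in ["orders.json", "2024/05/events.json", "orders.txt"]:
    routed = bool(pattern.fullmatch(name))
    print(f"{name}: {'my-topic' if routed else 'no match'}")
```

Here the two `.json` files match and `orders.txt` doesn't, which is the routing behavior the table describes.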
Select **Review + create** to continue.
### Review + create
Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.

In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
---
title: Create a Confluent Connector for Azure Cosmos DB (Preview)
description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB.
ms.topic: how-to
ms.date: 10/30/2025
ms.author: malev
author: maud-lv

#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB so that I can use Confluent Connectors in Azure.
---

# Create a Confluent Connector to Azure Cosmos DB (preview)

Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.

In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB.

## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
* An [Azure Cosmos DB](/azure/cosmos-db/) resource.
* A [Confluent organization](./create.md) created in Azure Native Integrations.
* The Owner or Contributor role for the Azure subscription. You might need to ask your subscription administrator to assign you one of these roles.
* A [configured environment, cluster, and topic](https://docs.confluent.io/cloud/current/get-started/index.html) inside the Confluent organization. If you don't have one already, go to Confluent to create these components.

## Create a Confluent sink connector for Azure Cosmos DB (preview)

To create a sink connector for Azure Cosmos DB:

1. In the Azure portal, go to your Confluent organization.
1. In the left pane, select **Data streaming** > **Confluent Connectors (Preview)**.
1. Select **Create new connector**.

:::image type="content" source="./media/add-cosmos-db-connector/create-connector.png" alt-text="Screenshot that shows the steps for creating a connector." lightbox="./media/add-cosmos-db-connector/create-connector.png":::

1. In the **Create a new connector** pane, configure the settings that are described in the next sections.

### Basics

On the **Basics** tab, enter or select values for the following settings:

| Name | Action |
| --- | --- |
| **Connector Type** | Select **Sink**. |
| **Connector Plugin** | Select **Azure Cosmos DB V2**. |
| **Connector Name** | Enter a name for your connector. For example, *cosmos-sink-connector*. |
| **Environment** | Select the environment where you want to create the connector. |
| **Cluster** | Select the cluster where you want to create the connector. |
| **Topics** | Select one or more Kafka topics to pull data from. |
| **Cosmos DB Account** | Select the destination Azure Cosmos DB account in your Azure tenant. |
| **Cosmos DB database** | Select the destination Azure Cosmos DB database under the account. |

:::image type="content" source="./media/add-cosmos-db-connector/create-connector-settings.png" alt-text="Screenshot that shows the settings for creating a connector." lightbox="./media/add-cosmos-db-connector/create-connector-settings.png":::

### Authentication

On the **Authentication** tab, select an authentication method: **User** or **Service account**.

- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.

:::image type="content" source="./media/add-cosmos-db-connector/authentication-tab.png" alt-text="Screenshot that shows the Authentication tab." lightbox="./media/add-cosmos-db-connector/authentication-tab.png":::

### Configuration

On the **Configuration** tab, enter or select the following values, and then select **Next**.

| Setting | Action |
| --- | --- |
| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Id Strategy** | Select the ID strategy used to derive the Azure Cosmos DB item ID. |
| **Cosmos DB Write Configuration** | Select the write behavior for Azure Cosmos DB items. |
| **Topic container map** | Map Kafka topics to Azure Cosmos DB containers. Use the format `topic1#container1,topic2#container2...`. |
| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |

:::image type="content" source="./media/add-cosmos-db-connector/configuration-tab.png" alt-text="Screenshot that shows the Configuration tab." lightbox="./media/add-cosmos-db-connector/configuration-tab.png":::
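The **Topic container map** value is a plain comma-separated list of `#`-delimited pairs. The following sketch parses the documented format so you can see how a value decomposes; the topic and container names are invented for illustration, and this is not Confluent code.

```python
def parse_topic_container_map(value: str) -> dict[str, str]:
    """Split 'topic1#container1,topic2#container2' into {topic: container}."""
    mapping = {}
    for entry in value.split(","):
        topic, container = entry.split("#", 1)
        mapping[topic.strip()] = container.strip()
    return mapping

print(parse_topic_container_map("orders#orders-container,payments#payments-container"))
# {'orders': 'orders-container', 'payments': 'payments-container'}
```

Each Kafka topic on the left of a `#` is written to the Azure Cosmos DB container on its right.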
For more information, see [Azure Cosmos DB Sink V2 Connector for Confluent Cloud](https://docs.confluent.io/cloud/current/connectors/cc-azure-cosmos-sink-v2.html).

Select **Review + create** to continue.
### Review + create

Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment. In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows that the connector is created, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.

## Create a Confluent source connector for Azure Cosmos DB (preview)

1. In the Azure portal, go to your Confluent organization.
1. In the left pane, select **Data streaming** > **Confluent Connectors (Preview)**.
1. Select **Create new connector**.
1. In the **Create a new connector** pane, configure the settings that are described in the following sections.

### Basics

On the **Basics** tab, enter or select values for the following settings:

| Setting | Action |
| --- | --- |
| **Connector Type** | Select **Source**. |
| **Connector Class** | Select **Azure Cosmos DB V2**. |
| **Connector Name** | Enter a name for your connector. For example, *cosmos-source-connector*. |
| **Environment** | Select the environment where you want to create the connector. |
| **Cluster** | Select the cluster where you want to create the connector. |
| **Cosmos DB Account** | Select the source Azure Cosmos DB account. |
| **Cosmos DB database** | Select the source Azure Cosmos DB database. |

:::image type="content" source="./media/add-cosmos-db-connector/source-basics-tab.png" alt-text="Screenshot that shows the Basics tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-basics-tab.png":::

### Authentication

On the **Authentication** tab, select an authentication method: **User** or **Service account**.

- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.

:::image type="content" source="./media/add-cosmos-db-connector/source-authentication-tab.png" alt-text="Screenshot that shows the Authentication tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-authentication-tab.png":::

### Configuration

On the **Configuration** tab, enter or select the following values, and then select **Next**.

| Name | Action |
| --- | --- |
| **Output Data Format** | Select an output Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
| **Container topic map** | Map Azure Cosmos DB containers to Kafka topics. Use the format `container1#topic1,container2#topic2...`. |
| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |

:::image type="content" source="./media/add-cosmos-db-connector/source-configuration-tab.png" alt-text="Screenshot that shows the Configuration tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-configuration-tab.png":::
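Because the `container1#topic1,...` string is easy to mistype, a small check like the one below can catch a missing `#` separator or a container listed twice before you submit the form. It's an illustrative sketch with made-up names, not part of the connector.

```python
def validate_container_topic_map(value: str) -> list[str]:
    """Return a list of problems found in a 'container#topic,...' mapping string."""
    problems, seen = [], set()
    for entry in value.split(","):
        if "#" not in entry:
            problems.append(f"missing '#' separator: {entry!r}")
            continue
        container, _topic = entry.split("#", 1)
        if container in seen:
            problems.append(f"container mapped twice: {container!r}")
        seen.add(container)
    return problems

# A duplicate container is flagged; a well-formed map returns no problems.
print(validate_container_topic_map("orders-data#orders,orders-data#payments"))
```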
Select **Review + create** to continue.

### Review + create

Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment. In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows that the connector is created, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.