
Commit 45adba7

Merge pull request #306768 from v-albemi/confluent-cosmos
Azure Native Integrations - Confluent
2 parents 2c6b0f5 + de47770 commit 45adba7

12 files changed

Lines changed: 198 additions & 46 deletions

articles/partner-solutions/apache-kafka-confluent-cloud/add-confluent-connectors.md

Lines changed: 27 additions & 45 deletions
@@ -1,22 +1,19 @@
 ---
-title: Use Confluent Connectors in Azure (Preview)
+title: Create a Confluent Connector for Azure Blob Storage (Preview)
 description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 ms.topic: how-to
-ms.date: 05/28/2024
+ms.date: 10/30/2025
 ms.author: malev
 author: maud-lv
 
-#customer intent: As a developer, I want learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
+#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
 ---
 
-# Use Confluent Connectors in Azure (preview)
+# Create a Confluent Connector to Azure Blob Storage (preview)
 
-Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. The solution is available on Azure by using the Confluent Connectors feature.
+Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.
 
-> [!NOTE]
-> Currently, Apache Kafka & Apache Flink on Confluent Cloud, an Azure Native Integrations service, supports only Confluent Connectors for Azure Blob Storage. It supports both source and sink connectors in Azure Blob Storage.
-
-In this article, learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
+In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 
 ## Prerequisites
 
@@ -36,7 +33,7 @@ To create a sink connector for Azure Blob Storage:
 :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
 1. Select **Create new connector**.
-1. On the **Create a new connector** pane, configure the settings that are described in the next sections.
+1. In the **Create a new connector** pane, configure the settings that are described in the next sections.
 
 ### Basics
 
@@ -56,13 +53,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
 :::image type="content" source="./media/confluent-connectors/basic-sink.png" alt-text="Screenshot that shows the Basics tab and creating a sink connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
 
 :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
 
@@ -72,30 +70,30 @@ On the **Configuration** tab, enter or select the following values, and then sel
 
 | Setting | Action |
 | --- | --- |
-| **Input Data Format** | Select an input Kafka record data format type: AVRO, JSON, string, or Protobuf. |
-| **Output Data Format** | Select an output data format: AVRO, JSON, string, or Protobuf. |
+| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 | **Time Interval** | Select the time interval in which to group the data. Choose between hourly and daily. |
 | **Flush size** | Optionally, you can enter a flush size. The default flush size is 1,000. |
-| **Number of tasks** | Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+| **Number of tasks** | Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
 :::image type="content" source="./media/confluent-connectors/configuration-sink.png" alt-text="Screenshot that shows the Configuration tab for a sink connector in the Azure portal.":::
 
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.
 
 ## Create a Confluent source connector for Azure Blob Storage (preview)
 
 1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors (Preview)**.
+1. In the left pane, select **Confluent** > **Confluent Connectors (Preview)**.
 
 :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
-1. On the **Create a new connector** pane, select **Create new connector**.
+1. In the **Create a new connector** pane, select **Create new connector**.
 
 ### Basics
 
@@ -114,13 +112,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
 :::image type="content" source="./media/confluent-connectors/basic-source.png" alt-text="Screenshot that shows the Basics tab and creating a source connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
 
 :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
 
@@ -130,38 +129,21 @@ On the **Configuration** tab, enter or select values for the following settings:
 
 | Name | Action |
 | --- | --- |
-| **Input Data Format** | Select an input Kafka record data format type: AVRO, JSON, string, Protobuf. |
-| **Output Data Format** | Select an output data format: AVRO, JSON, string, or Protobuf. |
+| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 | **Topic name and regex** | Configure the topic name and the regex pattern of your messages to ensure they're mapped. For example, `*my-topic:.*\.json+` moves all the files that have the `.json` extension into `my-topic`. |
 | **Flush size** | (Optional) Enter a flush size. The default flush size is 1,000. |
-| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
 :::image type="content" source="./media/confluent-connectors/configuration-source.png" alt-text="Screenshot that shows the Configuration tab and creating a source connector in the Azure portal.":::
 
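The **Topic name and regex** row above routes storage objects into a topic by regular expression. As a rough sketch of how the table's example pattern `.*\.json+` selects names ending in `.json` (the blob names are hypothetical, and Python's `re` semantics match the connector's Java regex for this simple pattern):

```python
import re

# The pattern from the table's example; `\.json+` requires a literal
# ".json" suffix (the + applies only to the final "n").
pattern = re.compile(r".*\.json+")

# Hypothetical blob names, for illustration only.
blobs = ["orders/2024-01-01.json", "orders/readme.txt", "events/click.json"]

# Names the connector would route into `my-topic`.
matched = [b for b in blobs if pattern.fullmatch(b)]
print(matched)  # ['orders/2024-01-01.json', 'events/click.json']
```

Only the two `.json` names match; `readme.txt` is left out of the topic.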
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
-
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
-
-## Manage Azure Confluent Connectors (preview)
-
-1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors**.
-1. Select your environment and cluster.
-
-The Azure portal shows a list of Azure connectors for the environment and cluster.
-
-You can also complete the following optional actions:
-
-* Filter connectors by **Type** (**Source** or **Sink**) and **Status** (**Running**, **Failed**, **Provisioning**, or **Paused**).
-* Search for a connector by name.
-
-:::image type="content" source="./media/confluent-connectors/display-connectors.png" alt-text="Screenshot that shows a list of existing connectors on the Confluent Connectors tab in the Azure portal." lightbox="./media/confluent-connectors/display-connectors.png":::
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-To learn more about a connector, select the connector tile to open Confluent. In the Confluent UI, you can see the connector health, throughput, and other information. You also can edit and delete the connector.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.
 
 ## Related content

Lines changed: 137 additions & 0 deletions
@@ -0,0 +1,137 @@
+---
+title: Create a Confluent Connector for Azure Cosmos DB (Preview)
+description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB.
+ms.topic: how-to
+ms.date: 10/30/2025
+ms.author: malev
+author: maud-lv
+
+#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB so that I can use Confluent Connectors in Azure.
+---
+
+# Create a Confluent Connector to Azure Cosmos DB (preview)
+
+Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.
+
+In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Cosmos DB.
+
+## Prerequisites
+
+* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
+* An [Azure Cosmos DB](/azure/cosmos-db/) resource.
+* A [Confluent organization](./create.md) created in Azure Native Integrations.
+* The Owner or Contributor role for the Azure subscription. You might need to ask your subscription administrator to assign you one of these roles.
+* A [configured environment, cluster, and topic](https://docs.confluent.io/cloud/current/get-started/index.html) inside the Confluent organization. If you don't have one already, go to Confluent to create these components.
+
+## Create a Confluent sink connector for Azure Cosmos DB (preview)
+
+To create a sink connector for Azure Cosmos DB:
+
+1. In the Azure portal, go to your Confluent organization.
+1. In the left pane, select **Data streaming** > **Confluent Connectors (Preview)**.
+1. Select **Create new connector**.
+
+:::image type="content" source="./media/add-cosmos-db-connector/create-connector.png" alt-text="Screenshot that shows the steps for creating a connector." lightbox="./media/add-cosmos-db-connector/create-connector.png":::
+
+1. In the **Create a new connector** pane, configure the settings that are described in the next sections.
+
+### Basics
+
+On the **Basics** tab, enter or select values for the following settings:
+
+|Name|Action|
+|-|-|
+|**Connector Type**|Select **Sink**.|
+|**Connector Plugin**|Select **Azure Cosmos DB V2**.|
+|**Connector Name**|Enter a name for your connector. For example, *cosmos-sink-connector*.|
+|**Environment**|Select the environment where you want to create the connector.|
+|**Cluster**|Select the cluster where you want to create the connector.|
+|**Topics**|Select one or more Kafka topics to pull data from.|
+|**Cosmos DB Account**|Select the destination Azure Cosmos DB account in your Azure tenant.|
+|**Cosmos DB database**|Select the destination Azure Cosmos DB database under the account.|
+
+:::image type="content" source="./media/add-cosmos-db-connector/create-connector-settings.png" alt-text="Screenshot that shows the settings for creating a connector." lightbox="./media/add-cosmos-db-connector/create-connector-settings.png":::
+
+### Authentication
+
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
+
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
+
+:::image type="content" source="./media/add-cosmos-db-connector/authentication-tab.png" alt-text="Screenshot that shows the Authentication tab." lightbox="./media/add-cosmos-db-connector/authentication-tab.png":::
+
+### Configuration
+
+On the **Configuration** tab, enter or select the following values, and then select **Next**.
+
+|Setting|Action|
+|-|-|
+|**Input Data Format**|Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**.|
+|**Id Strategy**|Select the ID strategy used to derive the Azure Cosmos DB item ID.|
+|**Cosmos DB Write Configuration**|Select the write behavior for Azure Cosmos DB items.|
+|**Topic container map**|Map Kafka topics to Azure Cosmos DB containers. Use the format `topic1#container1,topic2#container2...`.|
+|**Number of tasks**|(Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**.|
+
+:::image type="content" source="./media/add-cosmos-db-connector/configuration-tab.png" alt-text="Screenshot that shows the Configuration tab." lightbox="./media/add-cosmos-db-connector/configuration-tab.png":::
+
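The **Topic container map** value above is a comma-separated list of `topic#container` pairs. A minimal sketch of how such a string breaks down (the helper function and the topic and container names are hypothetical, for illustration only; the portal accepts the raw string as-is):

```python
# Hypothetical parser for the 'topic1#container1,topic2#container2' format
# described in the Configuration table; not part of the connector itself.
def parse_topic_container_map(value: str) -> dict[str, str]:
    mapping = {}
    for pair in value.split(","):
        # Each pair is 'topic#container'; split on the first '#' only.
        topic, container = pair.split("#", 1)
        mapping[topic.strip()] = container.strip()
    return mapping

result = parse_topic_container_map("orders#orders-container,users#users-container")
print(result)  # {'orders': 'orders-container', 'users': 'users-container'}
```

Each Kafka topic on the left of a `#` is written to the Azure Cosmos DB container on its right.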
+For more information, see [Azure Cosmos DB Sink V2 Connector for Confluent Cloud](https://docs.confluent.io/cloud/current/connectors/cc-azure-cosmos-sink-v2.html).
+
+Select **Review + create** to continue.
+
+### Review + create
+
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment. In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows that the connector is created, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.
+
+## Create a Confluent source connector for Azure Cosmos DB (preview)
+
+1. In the Azure portal, go to your Confluent organization.
+1. In the left pane, select **Data streaming** > **Confluent Connectors (Preview)**.
+1. Select **Create new connector**.
+1. In the **Create a new connector** pane, configure the settings that are described in the following sections.
+
+### Basics
+
+On the **Basics** tab, enter or select values for the following settings:
+
+|Setting|Action|
+|-|-|
+|**Connector Type**|Select **Source**.|
+|**Connector Class**|Select **Azure Cosmos DB V2**.|
+|**Connector Name**|Enter a name for your connector. For example, *cosmos-source-connector*.|
+|**Environment**|Select the environment where you want to create the connector.|
+|**Cluster**|Select the cluster where you want to create the connector.|
+|**Cosmos DB Account**|Select the source Azure Cosmos DB account.|
+|**Cosmos DB database**|Select the source Azure Cosmos DB database.|
+
+:::image type="content" source="./media/add-cosmos-db-connector/source-basics-tab.png" alt-text="Screenshot that shows the Basics tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-basics-tab.png":::
+
+### Authentication
+
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
+
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
+
+:::image type="content" source="./media/add-cosmos-db-connector/source-authentication-tab.png" alt-text="Screenshot that shows the Authentication tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-authentication-tab.png":::
+
+### Configuration
+
+On the **Configuration** tab, enter or select the following values, and then select **Next**.
+
+|Name|Action|
+|-|-|
+|**Output Data Format**|Select an output Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**.|
+|**Container topic map**|Map Azure Cosmos DB containers to Kafka topics. Use the format `container1#topic1,container2#topic2...`.|
+|**Number of tasks**|(Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**.|
+
+:::image type="content" source="./media/add-cosmos-db-connector/source-configuration-tab.png" alt-text="Screenshot that shows the Configuration tab for creating a source connector." lightbox="./media/add-cosmos-db-connector/source-configuration-tab.png":::
+
+Select **Review + create** to continue.
+
+### Review + create
+
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment. In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows that the connector is created, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile.
+
+## Related content
+- [Manage Confluent Connectors](manage-confluent-connectors.md)
