
Commit 20ed874

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into afd-faq

2 parents 6dc2478 + 1cf72ec

15 files changed

Lines changed: 257 additions & 53 deletions

articles/azure-functions/functions-bindings-mcp.md

Lines changed: 8 additions & 1 deletion
@@ -86,6 +86,9 @@ You can use the `extensions.mcp` section in `host.json` to define MCP server inf
       "encryptClientState": true,
       "messageOptions": {
         "useAbsoluteUriForEndpoint": false
+      },
+      "system": {
+        "webhookAuthorizationLevel": "System"
       }
     }
 }
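Per the `system.webhookAuthorizationLevel` entry this commit adds, a `host.json` fragment that opts out of the key requirement might look like the following sketch (the surrounding structure mirrors the diff above, and the setting requires Functions host version 4.1045.0 or later):

```json
{
  "extensions": {
    "mcp": {
      "system": {
        "webhookAuthorizationLevel": "Anonymous"
      }
    }
  }
}
```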
@@ -100,6 +103,8 @@ You can use the `extensions.mcp` section in `host.json` to define MCP server inf
 | **encryptClientState** | Determines if client state is encrypted. Defaults to `true`. Setting it to `false` can be useful for debugging and test scenarios but isn't recommended for production. |
 | **messageOptions** | Options object for the message endpoint in the SSE transport. |
 | **messageOptions.UseAbsoluteUriForEndpoint** | Defaults to `false`. Only applies to the server-sent events (SSE) transport; this setting doesn't affect the Streamable HTTP transport. If set to `false`, the message endpoint is provided as a relative URI during initial connections over the SSE transport. If set to `true`, the message endpoint is returned as an absolute URI. Using an absolute URI isn't recommended unless you have a specific reason to do so. |
+| **system** | Options object for system-level configuration. |
+| **system.webhookAuthorizationLevel** | Defines the authorization level required for the webhook endpoint. Defaults to `"System"`. Allowed values are `"System"` and `"Anonymous"`. When you set the value to `"Anonymous"`, an access key is no longer required for requests. Regardless of whether a key is required, you can use [built-in MCP server authorization][authorization] as an identity-based access control layer.<br/>This setting is available only when running on Functions host version 4.1045.0 or later. |
 
 ## Connect to your MCP server
 
@@ -112,7 +117,9 @@ To connect to the MCP server exposed by your function app, you need to provide a
 
 <sup>1</sup> Newer protocol versions deprecated the Server-Sent Events transport. Unless your client specifically requires it, you should use the Streamable HTTP transport instead.
 
-When hosted in Azure, the endpoints exposed by the extension also require the [system key](./function-keys-how-to.md) named `mcp_extension`. If it isn't provided in the `x-functions-key` HTTP header or in the `code` query string parameter, your client receives a `401 Unauthorized` response. You can retrieve the key using any of the methods described in [Get your function access keys](./function-keys-how-to.md#get-your-function-access-keys). The following example shows how to get the key with the Azure CLI:
+When hosted in Azure, by default, the endpoints exposed by the extension also require the [system key](./function-keys-how-to.md) named `mcp_extension`. If it isn't provided in the `x-functions-key` HTTP header or in the `code` query string parameter, your client receives a `401 Unauthorized` response. You can remove this requirement by setting the `system.webhookAuthorizationLevel` property in `host.json` to `Anonymous`. For more information, see the [host.json settings](#hostjson-settings) section.
+
+You can retrieve the key using any of the methods described in [Get your function access keys](./function-keys-how-to.md#get-your-function-access-keys). The following example shows how to get the key with the Azure CLI:
 
 ```azurecli
 az functionapp keys list --resource-group <RESOURCE_GROUP> --name <APP_NAME> --query systemKeys.mcp_extension --output tsv
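As the text above describes, a client can supply the `mcp_extension` system key either in the `x-functions-key` HTTP header or as the `code` query string parameter. The following sketch illustrates both options; the helper names and the URL are hypothetical, only the header and parameter names come from the documentation:

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

def key_header(key: str) -> dict:
    # Option 1: send the system key in the x-functions-key HTTP header.
    return {"x-functions-key": key}

def key_query(url: str, key: str) -> str:
    # Option 2: append (or replace) the system key as the `code`
    # query string parameter on the endpoint URL.
    scheme, netloc, path, query, frag = urlsplit(url)
    params = parse_qs(query)
    params["code"] = [key]
    return urlunsplit((scheme, netloc, path, urlencode(params, doseq=True), frag))
```

For example, `key_query("https://contoso.azurewebsites.net/runtime/webhooks/mcp", "<KEY>")` appends `?code=<KEY>` to the hypothetical endpoint URL.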

articles/partner-solutions/apache-kafka-confluent-cloud/add-confluent-connectors.md

Lines changed: 27 additions & 45 deletions
@@ -1,22 +1,19 @@
 ---
-title: Use Confluent Connectors in Azure (Preview)
+title: Create a Confluent Connector for Azure Blob Storage (Preview)
 description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 ms.topic: how-to
-ms.date: 05/28/2024
+ms.date: 10/30/2025
 ms.author: malev
 author: maud-lv
 
-#customer intent: As a developer, I want learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
+#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
 ---
 
-# Use Confluent Connectors in Azure (preview)
+# Create a Confluent Connector to Azure Blob Storage (preview)
 
-Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. The solution is available on Azure by using the Confluent Connectors feature.
+Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.
 
-> [!NOTE]
-> Currently, Apache Kafka & Apache Flink on Confluent Cloud, an Azure Native Integrations service, supports only Confluent Connectors for Azure Blob Storage. It supports both source and sink connectors in Azure Blob Storage.
-
-In this article, learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
+In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 
 ## Prerequisites
 

@@ -36,7 +33,7 @@ To create a sink connector for Azure Blob Storage:
    :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
 1. Select **Create new connector**.
-1. On the **Create a new connector** pane, configure the settings that are described in the next sections.
+1. In the **Create a new connector** pane, configure the settings that are described in the next sections.
 
 ### Basics

@@ -56,13 +53,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
    :::image type="content" source="./media/confluent-connectors/basic-sink.png" alt-text="Screenshot that shows the Basics tab and creating a sink connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
 
 :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::

@@ -72,30 +70,30 @@ On the **Configuration** tab, enter or select the following values, and then sel
 
 | Setting | Action |
 | --- | --- |
-| **Input Data Format** | Select an input Kafka record data format type: AVRO, JSON, string, or Protobuf. |
-| **Output Data Format** | Select an output data format: AVRO, JSON, string, or Protobuf. |
+| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 | **Time Interval** | Select the time interval in which to group the data. Choose between hourly and daily. |
 | **Flush size** | Optionally, you can enter a flush size. The default flush size is 1,000. |
-| **Number of tasks** | Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+| **Number of tasks** | Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
 :::image type="content" source="./media/confluent-connectors/configuration-sink.png" alt-text="Screenshot that shows the Configuration tab for a sink connector in the Azure portal.":::
 
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
 
 ## Create a Confluent source connector for Azure Blob Storage (preview)
 
 1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors (Preview)**.
+1. In the left pane, select **Confluent** > **Confluent Connectors (Preview)**.
 
    :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
-1. On the **Create a new connector** pane, select **Create new connector**.
+1. In the **Create a new connector** pane, select **Create new connector**.
 
 ### Basics

@@ -114,13 +112,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
    :::image type="content" source="./media/confluent-connectors/basic-source.png" alt-text="Screenshot that shows the Basics tab and creating a source connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account is provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret are created for the specific user in Confluent Cloud when the connector is created.
 
 :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::

@@ -130,38 +129,21 @@ On the **Configuration** tab, enter or select values for the following settings:
 
 | Name | Action |
 | --- | --- |
-| **Input Data Format** | Select an input Kafka record data format type: AVRO, JSON, string, Protobuf. |
-| **Output Data Format** | Select an output data format: AVRO, JSON, string, or Protobuf. |
+| **Input Data Format** | Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+| **Output Data Format** | Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 | **Topic name and regex** | Configure the topic name and the regex pattern of your messages to ensure they're mapped. For example, `*my-topic:.*\.json+` moves all the files that have the `.json` extension into `my-topic`. |
 | **Flush size** | (Optional) Enter a flush size. The default flush size is 1,000. |
-| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+| **Number of tasks** | (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
 :::image type="content" source="./media/confluent-connectors/configuration-source.png" alt-text="Screenshot that shows the Configuration tab and creating a source connector in the Azure portal.":::
 
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
-
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
-
-## Manage Azure Confluent Connectors (preview)
-
-1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors**.
-1. Select your environment and cluster.
-
-   The Azure portal shows a list of Azure connectors for the environment and cluster.
-
-You can also complete the following optional actions:
-
-* Filter connectors by **Type** (**Source** or **Sink**) and **Status** (**Running**, **Failed**, **Provisioning**, or **Paused**).
-* Search for a connector by name.
-
-:::image type="content" source="./media/confluent-connectors/display-connectors.png" alt-text="Screenshot that shows a list of existing connectors on the Confluent Connectors tab in the Azure portal." lightbox="./media/confluent-connectors/display-connectors.png":::
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-To learn more about a connector, select the connector tile to open Confluent. In the Confluent UI, you can see the connector health, throughput, and other information. You also can edit and delete the connector.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
 
 ## Related content
