`articles/azure-functions/functions-bindings-mcp.md` — 8 additions, 1 deletion
```diff
@@ -86,6 +86,9 @@ You can use the `extensions.mcp` section in `host.json` to define MCP server inf
       "encryptClientState": true,
       "messageOptions": {
         "useAbsoluteUriForEndpoint": false
+      },
+      "system": {
+        "webhookAuthorizationLevel": "System"
       }
     }
   }
```
```diff
@@ -100,6 +103,8 @@ You can use the `extensions.mcp` section in `host.json` to define MCP server inf
 |**encryptClientState**| Determines if client state is encrypted. Defaults to true. Setting to false may be useful for debugging and test scenarios but isn't recommended for production. |
 |**messageOptions**| Options object for the message endpoint in the SSE transport. |
 |**messageOptions.UseAbsoluteUriForEndpoint**| Defaults to `false`. Only applicable to the server-sent events (SSE) transport; this setting doesn't affect the Streamable HTTP transport. If set to `false`, the message endpoint is provided as a relative URI during initial connections over the SSE transport. If set to `true`, the message endpoint is returned as an absolute URI. Using an absolute URI isn't recommended unless you have a specific reason to do so.|
+|**system**| Options object for system-level configuration. |
+|**system.webhookAuthorizationLevel**| Defines the authorization level required for the webhook endpoint. Defaults to "System". Allowed values are "System" and "Anonymous". When you set the value to "Anonymous", an access key is no longer required for requests. Regardless of whether a key is required, you can use [built-in MCP server authorization][authorization] as an identity-based access control layer.<br/>This setting is only available when running on Functions host version 4.1045.0 or later.|
 
 ## Connect to your MCP server
 
```
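Putting the settings in the table above together, a complete `extensions.mcp` section including the new `system` block might look like the following. This is a sketch assembled from the fragment shown in the diff; the surrounding `"version": "2.0"` wrapper is standard `host.json` boilerplate, and the property values are illustrative defaults, not recommendations:

```json
{
  "version": "2.0",
  "extensions": {
    "mcp": {
      "encryptClientState": true,
      "messageOptions": {
        "useAbsoluteUriForEndpoint": false
      },
      "system": {
        "webhookAuthorizationLevel": "System"
      }
    }
  }
}
```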
````diff
@@ -112,7 +117,9 @@ To connect to the MCP server exposed by your function app, you need to provide a
 
 <sup>1</sup> Newer protocol versions deprecated the Server-Sent Events transport. Unless your client specifically requires it, you should use the Streamable HTTP transport instead.
 
-When hosted in Azure, the endpoints exposed by the extension also require the [system key](./function-keys-how-to.md) named `mcp_extension`. If it isn't provided in the `x-functions-key` HTTP header or in the `code` query string parameter, your client receives a `401 Unauthorized` response. You can retrieve the key using any of the methods described in [Get your function access keys](./function-keys-how-to.md#get-your-function-access-keys). The following example shows how to get the key with the Azure CLI:
+When hosted in Azure, by default, the endpoints exposed by the extension also require the [system key](./function-keys-how-to.md) named `mcp_extension`. If it isn't provided in the `x-functions-key` HTTP header or in the `code` query string parameter, your client receives a `401 Unauthorized` response. You can remove this requirement by setting the `system.webhookAuthorizationLevel` property in `host.json` to `Anonymous`. For more information, see the [host.json settings](#hostjson-settings) section.
+
+You can retrieve the key using any of the methods described in [Get your function access keys](./function-keys-how-to.md#get-your-function-access-keys). The following example shows how to get the key with the Azure CLI:
 
 ```azurecli
 az functionapp keys list --resource-group <RESOURCE_GROUP> --name <APP_NAME> --query systemKeys.mcp_extension --output tsv
````
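Once you have the key, you can pass it to the server in the `x-functions-key` header. The following is a usage sketch, not part of the original article: the resource group and app names are placeholders, and the `/runtime/webhooks/mcp` endpoint path is an assumption — substitute the MCP endpoint your function app actually exposes:

```shell
# Capture the mcp_extension system key (hypothetical resource names).
MCP_KEY=$(az functionapp keys list --resource-group my-rg --name my-func-app \
  --query systemKeys.mcp_extension --output tsv)

# Send the key in the x-functions-key header; without it (and with the
# default "System" authorization level), the server returns 401 Unauthorized.
curl -i -H "x-functions-key: $MCP_KEY" \
  "https://my-func-app.azurewebsites.net/runtime/webhooks/mcp"
```

Alternatively, append the key as a `code` query string parameter instead of sending the header.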
`articles/partner-solutions/apache-kafka-confluent-cloud/add-confluent-connectors.md` — 27 additions, 45 deletions
```diff
@@ -1,22 +1,19 @@
 ---
-title: Use Confluent Connectors in Azure (Preview)
+title: Create a Confluent Connector for Azure Blob Storage (Preview)
 description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 ms.topic: how-to
-ms.date: 05/28/2024
+ms.date: 10/30/2025
 ms.author: malev
 author: maud-lv
 
-#customer intent: As a developer, I want learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
+#customer intent: As a developer, I want to learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage so that I can use Confluent Connectors in Azure.
 ---
 
-# Use Confluent Connectors in Azure (preview)
+# Create a Confluent Connector to Azure Blob Storage (preview)
 
-Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. The solution is available on Azure by using the Confluent Connectors feature.
+Confluent Cloud helps you connect your Confluent clusters to popular data sources and sinks. You can take advantage of this solution on Azure by using the Confluent Connectors feature.
 
-> [!NOTE]
-> Currently, Apache Kafka & Apache Flink on Confluent Cloud, an Azure Native Integrations service, supports only Confluent Connectors for Azure Blob Storage. It supports both source and sink connectors in Azure Blob Storage.
-
-In this article, learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
+In this article, you'll learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.
 
 ## Prerequisites
```
```diff
@@ -36,7 +33,7 @@ To create a sink connector for Azure Blob Storage:
    :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
 1. Select **Create new connector**.
-1. On the **Create a new connector** pane, configure the settings that are described in the next sections.
+1. In the **Create a new connector** pane, configure the settings that are described in the next sections.
 
 ### Basics
 
```
```diff
@@ -56,13 +53,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
    :::image type="content" source="./media/confluent-connectors/basic-sink.png" alt-text="Screenshot that shows the Basics tab and creating a sink connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account will be provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret will be created for the specific user in Confluent Cloud when the connector is created.
 
    :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
 
```
```diff
@@ -72,30 +70,30 @@ On the **Configuration** tab, enter or select the following values, and then sel
 
 | Setting | Action |
 | --- | --- |
-|**Input Data Format**| Select an input Kafka record data format type: AVRO, JSON, string, or Protobuf. |
-|**Output Data Format**| Select an output data format: AVRO, JSON, string, or Protobuf. |
+|**Input Data Format**| Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+|**Output Data Format**| Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 |**Time Interval**| Select the time interval in which to group the data. Choose between hourly and daily. |
 |**Flush size**| Optionally, you can enter a flush size. The default flush size is 1,000. |
-|**Number of tasks**| Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+|**Number of tasks**| Optionally, you can enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
    :::image type="content" source="./media/confluent-connectors/configuration-sink.png" alt-text="Screenshot that shows the Configuration tab for a sink connector in the Azure portal.":::
 
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
 
 ## Create a Confluent source Connector for Azure Blob Storage (preview)
 
 1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors (Preview)**.
+1. In the left pane, select **Confluent** > **Confluent Connectors (Preview)**.
 
    :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot that shows the Confluent Connectors menu in the Azure portal.":::
 
-1. On the **Create a new connector** pane, select **Create new connector**.
+1. In the **Create a new connector** pane, select **Create new connector**.
 
 ### Basics
 
```
```diff
@@ -114,13 +112,14 @@ On the **Basics** tab, enter or select values for the following settings:
 
    :::image type="content" source="./media/confluent-connectors/basic-source.png" alt-text="Screenshot that shows the Basics tab and creating a source connector in the Azure portal.":::
 
-Then, select **Next**.
+Select **Next**.
 
 ### Authentication
 
-On the **Authentication** tab, you can configure the authentication of your Kafka cluster via API keys. By default, **Create New** is selected and API keys are automatically generated and configured when the connector is created.
+On the **Authentication** tab, select an authentication method: **User** or **Service account**.
 
-Leave the default values and select the **Configuration** tab.
+- To use a service account (recommended for production), enter a **Service account** name and continue. A new service account will be provisioned in Confluent Cloud when the connector is created.
+- To use a user account, leave **User** selected and continue. A user API key and secret will be created for the specific user in Confluent Cloud when the connector is created.
 
    :::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot that shows the Authentication tab in the Azure portal.":::
 
```
```diff
@@ -130,38 +129,21 @@ On the **Configuration** tab, enter or select values for the following settings:
 
 | Name | Action |
 | --- | --- |
-|**Input Data Format**| Select an input Kafka record data format type: AVRO, JSON, string, Protobuf. |
-|**Output Data Format**| Select an output data format: AVRO, JSON, string, or Protobuf. |
+|**Input Data Format**| Select an input Kafka record data format type: **AVRO**, **JSON**, **string**, or **Protobuf**. |
+|**Output Data Format**| Select an output data format: **AVRO**, **JSON**, **string**, or **Protobuf**. |
 |**Topic name and regex**| Configure the topic name and the regex pattern of your messages to ensure they're mapped. For example, `*my-topic:.*\.json+` moves all the files that have the `.json` extension into `my-topic`. |
 |**Flush size**| (Optional) Enter a flush size. The default flush size is 1,000. |
-|**Number of tasks**| (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is 1. |
+|**Number of tasks**| (Optional) Enter the maximum number of simultaneous tasks you want your connector to support. The default is **1**. |
 
    :::image type="content" source="./media/confluent-connectors/configuration-source.png" alt-text="Screenshot that shows the Configuration tab and creating a source connector in the Azure portal.":::
 
 Select **Review + create** to continue.
 
 ### Review + create
 
-Review your settings for the connector to ensure that the details are accurate and complete. Then, select **Create** to begin the connector deployment.
-
-In the upper-right corner of the Azure portal, a notification displays the deployment status. When it shows the status *Completed*, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
-
-## Manage Azure Confluent Connectors (preview)
-
-1. In the Azure portal, go to your Confluent organization.
-1. On the left menu, select **Confluent** > **Confluent Connectors**.
-1. Select your environment and cluster.
-
-   The Azure portal shows a list of Azure connectors for the environment and cluster.
-
-   You can also complete the following optional actions:
-
-   * Filter connectors by **Type** (**Source** or **Sink**) and **Status** (**Running**, **Failed**, **Provisioning**, or **Paused**).
-   * Search for a connector by name.
-
-   :::image type="content" source="./media/confluent-connectors/display-connectors.png" alt-text="Screenshot that shows a list of existing connectors on the Confluent Connectors tab in the Azure portal." lightbox="./media/confluent-connectors/display-connectors.png":::
+Review your settings for the connector to ensure that the details are accurate and complete. Then select **Create** to begin the connector deployment.
 
-To learn more about a connector, select the connector tile to open Confluent. In the Confluent UI, you can see the connector health, throughput, and other information. You also can edit and delete the connector.
+In the upper-right corner of the Azure portal, a notification displays the deployment status. When the status is **Completed**, refresh the **Confluent Connectors (Preview)** pane and check for the new connector tile on this pane.
```