You can import AI model endpoints deployed in Microsoft Foundry to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
To learn more about managing AI APIs in API Management, see:
* [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
## Client compatibility options
API Management supports the following client compatibility options for AI APIs from Microsoft Foundry. When you import the API by using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
* **Azure OpenAI**: Manage Azure OpenAI in Microsoft Foundry model deployments.

    Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. The request path includes the deployment name. Use this option if your Foundry tool only includes Azure OpenAI model deployments.
* **Azure AI**: Manage model endpoints in Microsoft Foundry that are exposed through the [Azure AI Model Inference API](/rest/api/aifoundry/modelinference/).

    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. The request body includes the deployment name. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.
* **Azure OpenAI v1**: Manage Azure OpenAI in Microsoft Foundry model deployments by using the [Azure OpenAI v1 API](/azure/foundry/openai/api-version-lifecycle).

    Clients call the deployment at an Azure OpenAI v1 model endpoint such as `openai/v1/my-model/chat/completions`. The request body includes the deployment name.
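The difference between the options can be sketched as follows. This is an illustration only; the gateway host, base path, and deployment name are hypothetical placeholders, and the exact path shape depends on the base path you configure during import.

```python
# Sketch: where the deployment name goes for each client compatibility option.
# GATEWAY, base path, and DEPLOYMENT are illustrative placeholders.
GATEWAY = "https://my-apim.azure-api.net/my-ai-api"
DEPLOYMENT = "my-deployment"
MESSAGES = [{"role": "user", "content": "Hello"}]

def azure_openai_request():
    # Azure OpenAI: the deployment name is part of the request path.
    url = f"{GATEWAY}/openai/deployments/{DEPLOYMENT}/chat/completions"
    body = {"messages": MESSAGES}
    return url, body

def azure_ai_request():
    # Azure AI (Model Inference API): the deployment name goes in the request body.
    url = f"{GATEWAY}/models/chat/completions"
    body = {"model": DEPLOYMENT, "messages": MESSAGES}
    return url, body

def azure_openai_v1_request():
    # Azure OpenAI v1: a v1-style endpoint; the deployment name goes in the
    # request body. The exact path depends on your configured base path.
    url = f"{GATEWAY}/openai/v1/chat/completions"
    body = {"model": DEPLOYMENT, "messages": MESSAGES}
    return url, body

for fn in (azure_openai_request, azure_ai_request, azure_openai_v1_request):
    url, body = fn()
    print(url, body)
```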
## Prerequisites
* An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
* A Foundry tool in your subscription with one or more models deployed. Examples include models deployed in Microsoft Foundry or Azure OpenAI.
* If you want to enable semantic caching for the API, see [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md) for prerequisites.
* If you want to enforce content safety checks on the API, see [Enforce content safety checks on LLM requests](llm-content-safety-policy.md) for prerequisites.
## Import Microsoft Foundry API by using the portal
Use the following steps to import an AI API to API Management.
When you import the API, API Management automatically configures:

* Operations for each of the API's REST API endpoints.
* A system-assigned identity with the necessary permissions to access the Foundry tool deployment.
* A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure AI Services endpoint.
* Authentication to the backend by using the instance's system-assigned managed identity.
* Optionally, policies to help you monitor and manage the API.
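For illustration, the routing and authentication portion of the generated configuration is conceptually similar to the following policy fragment. The backend ID is a hypothetical placeholder, and the policy that the import wizard actually generates can differ.

```xml
<policies>
    <inbound>
        <base />
        <!-- Route requests to the backend resource that represents the Azure AI Services endpoint -->
        <set-backend-service backend-id="my-foundry-backend" />
        <!-- Authenticate to the backend with the instance's system-assigned managed identity -->
        <authentication-managed-identity resource="https://cognitiveservices.azure.com" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
</policies>
```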
To import a Microsoft Foundry API to API Management:
1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
1. Under **Create from Azure resource**, select **Microsoft Foundry**.
    :::image type="content" source="media/azure-ai-foundry-api/ai-foundry-api.png" alt-text="Screenshot of creating an OpenAI-compatible API in the portal." :::
1. On the **Select AI Service** tab:
    1. Select the **Subscription** in which to search for Foundry Tools. To get information about the model deployments in a service, select the **deployments** link next to the service name.
        :::image type="content" source="media/azure-ai-foundry-api/deployments.png" alt-text="Screenshot of deployments for an AI service in the portal." lightbox="media/azure-ai-foundry-api/deployments.png":::
    1. Select a Foundry tool.
1. Select **Next**.
1. On the **Configure API** tab:
    1. Enter a **Display name** and optional **Description** for the API.
    1. In **Base path**, enter a path that your API Management instance uses to access the deployment endpoint.
    1. Optionally select one or more **Products** to associate with the API.
    1. In **Client compatibility**, select one of the following options based on the types of client you intend to support. For more information, see [Client compatibility options](#client-compatibility-options).
        * **Azure OpenAI** - Select this option if your clients only need to access Azure OpenAI in Microsoft Foundry model deployments.
        * **Azure AI** - Select this option if your clients need to access other models in Microsoft Foundry.
        * **Azure OpenAI v1** - Select this option if you want to use the Azure OpenAI v1 API with your Foundry model deployments.
1. Select **Next**.
    :::image type="content" source="media/azure-ai-foundry-api/client-compatibility.png" alt-text="Screenshot of Microsoft Foundry API configuration in the portal.":::
1. On the **Apply semantic caching** tab, optionally enter settings or accept defaults that define policies to help optimize performance and reduce latency for the API:
    * [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md)
1. On the **AI content safety** tab, optionally enter settings or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
    * [Enforce content safety checks on LLM requests](llm-content-safety-policy.md)
1. Select **Review**.
1. After the portal validates the settings, select **Create**.
## Test the AI API
To make sure your AI API works as expected, test it in the API Management test console.
1. Select the API you created in the previous step.
1. Select the **Test** tab.
1. Select an operation that's compatible with the model deployment.
```
> [!NOTE]
> In the test console, API Management automatically adds an **Ocp-Apim-Subscription-Key** header and sets the subscription key for the built-in [all-access subscription](api-management-subscriptions.md#all-access-subscription). This key provides access to every API in the API Management instance. To optionally display the **Ocp-Apim-Subscription-Key** header, select the "eye" icon next to the **HTTP Request**.
1. Select **Send**.
When the test is successful, the backend responds with a successful HTTP response code and some data. The response includes token usage data to help you monitor and manage your language model token consumption.
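For example, if the gateway returns an OpenAI-style `usage` object, you could read the token counts from the response body like this. The field names and values here are assumptions based on the common chat completions response shape, not the gateway's exact contract:

```python
import json

# Sample response body in the OpenAI chat completions shape (illustrative values).
response_body = json.loads("""
{
  "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
  "usage": {"prompt_tokens": 9, "completion_tokens": 3, "total_tokens": 12}
}
""")

# Extract the token usage data appended by the gateway.
usage = response_body.get("usage", {})
print(f"prompt={usage.get('prompt_tokens')}, "
      f"completion={usage.get('completion_tokens')}, "
      f"total={usage.get('total_tokens')}")
```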
---

*articles/app-service/app-service-configuration-references.md*
---
description: Set up Azure App Service and Azure Functions to use App Configurati
author: muksvso
ms.topic: how-to
ms.date: 03/30/2026
ms.author: mubatra
#customer intent: As a developer, I want to use Azure App Configuration references so that I can make configuration key/value pairs available to code.
ms.custom: sfi-ropc-nochange
---
This article shows how to work with configuration data in Azure App Service or Azure Functions applications without making any code changes. [Azure App Configuration](../azure-app-configuration/overview.md) is an Azure service that you can use to centrally manage application configuration. It's also an effective tool for auditing your configuration values over time or across releases.
## Important notes for Azure Functions local development
App Configuration references (`@Microsoft.AppConfiguration(...)`) are resolved by the Azure App Service/Functions platform when your app runs in Azure.
- **Azure (supported):** Put the reference in your function app's **Application settings** (for example, in the Azure portal, ARM/Bicep, or other deployment tooling).
- **Local (not supported):** The Functions host running on your development machine doesn't resolve `@Microsoft.AppConfiguration(...)` values from *local.settings.json*.
- **User secrets (not supported):** The Functions user secrets store (*secrets.json*) is also not processed for `@Microsoft.AppConfiguration(...)` references.
- **SDK code (not required for this feature):** Calling `AddAzureAppConfiguration()` configures the App Configuration SDK for in-process resolution, but it doesn't make the platform resolve `@Microsoft.AppConfiguration(...)` references locally.
If you want the same configuration values locally, use one of the following approaches:
- Add the values directly to *local.settings.json* (for example, set `MySetting` to the literal value you want locally).
- Use the App Configuration SDK in your app code (for example, by configuring `AddAzureAppConfiguration()` and connecting to your store with a connection string or credentials appropriate for local dev). This approach is separate from platform references.
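For the first approach, a *local.settings.json* that supplies a literal value locally might look like the following. The setting names other than `MySetting` are illustrative; adjust the runtime and storage values to match your project:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "MySetting": "my-local-value"
  }
}
```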
## Grant app access to App Configuration
To get started with using App Configuration references in App Service, first you create an App Configuration store. You then grant permissions to your app to access the configuration key/value pairs that are in the store.
An App Configuration reference has the form `@Microsoft.AppConfiguration(...)`.
Here's an example of a complete reference that includes `Label`:
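For illustration, a complete reference with a label can look like the following; the store endpoint is a hypothetical placeholder, and the key and label match the example used later in this article:

```
@Microsoft.AppConfiguration(Endpoint=https://myAppConfigStore.azconfig.io; Key=Demo:Color; Label=dev)
```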
Any configuration change to the app that results in a site restart causes an immediate refetch of all referenced key/value pairs from the App Configuration store.
> [!NOTE]
> Automatic refresh and refetch of these values when the key/value pairs are updated in App Configuration isn't currently supported.
## Working example (Azure Functions)
The following example shows where the `@Microsoft.AppConfiguration(...)` syntax goes for an Azure Functions app.
### 1) Create a key/value in App Configuration
In your App Configuration store, create a key/value pair:
- **Key:** `Demo:Color`
- **Label:** (optional) `dev`
- **Value:** `Blue`
### 2) Add an application setting to your function app in Azure
In your Function App (in Azure), add an application setting named `Demo__Color` and set its value to an App Configuration reference.
> [!NOTE]
> Use double underscores (`__`) if you want .NET configuration binding to map to `Demo:Color`.
At runtime, your code reads `Demo:Color` like any other app setting.
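In a Python function app, for example, the resolved setting surfaces as an environment variable. This is a sketch under the assumption that the app setting `Demo__Color` from the step above resolved to `Blue`:

```python
import os

# App settings surface as environment variables inside the Functions host.
# In Azure, the platform has already resolved the App Configuration reference,
# so the code sees the plain value (for example, "Blue").
os.environ.setdefault("Demo__Color", "Blue")  # simulated for local illustration
color = os.environ.get("Demo__Color", "default-color")
print(f"Configured color: {color}")
```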
> [!TIP]
> You don't need to call `AddAzureAppConfiguration()` for platform references. Use `AddAzureAppConfiguration()` only when you want to load configuration directly via the SDK.
## Source application settings from App Configuration
You can use App Configuration references as values for [application settings](configure-common.md#configure-app-settings) so you can keep configuration data in App Configuration instead of in the site configuration settings. Application settings and App Configuration key/value pairs are both securely encrypted at rest. If you need centralized configuration management capabilities, add configuration data to App Configuration.
Here's a sample template for a function app that has App Configuration references:
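A minimal sketch of what such a template fragment can look like follows. The resource names, API version, and store endpoint are placeholders for illustration, not the article's original sample:

```json
{
  "type": "Microsoft.Web/sites",
  "apiVersion": "2022-09-01",
  "name": "[parameters('functionAppName')]",
  "location": "[resourceGroup().location]",
  "identity": { "type": "SystemAssigned" },
  "properties": {
    "siteConfig": {
      "appSettings": [
        {
          "name": "Demo__Color",
          "value": "@Microsoft.AppConfiguration(Endpoint=https://myAppConfigStore.azconfig.io; Key=Demo:Color; Label=dev)"
        }
      ]
    }
  }
}
```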