articles/api-management/azure-ai-foundry-api.md (15 additions, 6 deletions)
```diff
@@ -5,7 +5,7 @@ ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
 ms.topic: how-to
-ms.date: 10/07/2025
+ms.date: 03/25/2026
 ms.update-cycle: 180-days
 ms.collection: ce-skilling-ai-copilot
 ms.custom: template-how-to, build-2024
```
```diff
@@ -24,7 +24,7 @@ Learn more about managing AI APIs in API Management:
 
 ## Client compatibility options
 
-API Management supports two client compatibility options for AI APIs from Microsoft Foundry. When you import the API using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
+API Management supports the following client compatibility options for AI APIs from Microsoft Foundry. When you import the API using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
 
 * **Azure OpenAI** - Manage Azure OpenAI in Microsoft Foundry model deployments.
 
```
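For the **Azure OpenAI** compatibility option, clients use the classic Azure OpenAI request shape, in which the deployment name is part of the URL path and an `api-version` query parameter is required. A minimal sketch of how such a request URL is composed; the gateway hostname, base path, deployment name, and API version below are placeholders, not values from this article:

```python
# Sketch: compose the classic Azure OpenAI route through an APIM gateway.
# Every value here is a hypothetical placeholder.
GATEWAY = "https://contoso.azure-api.net"  # API Management gateway (placeholder)
BASE_PATH = "my-foundry-api"               # Base path chosen at import (placeholder)
DEPLOYMENT = "gpt-4o-mini"                 # Model deployment name (placeholder)
API_VERSION = "2024-10-21"                 # An Azure OpenAI api-version (placeholder)

# Deployment name lives in the URL path for this option,
# unlike the Azure AI /models route, where it goes in the request body.
url = (
    f"{GATEWAY}/{BASE_PATH}/openai/deployments/{DEPLOYMENT}"
    f"/chat/completions?api-version={API_VERSION}"
)
print(url)
```

The contrast with the `/models` route described later (deployment name in the body, fixed path) is the main thing the compatibility choice changes for clients.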
```diff
@@ -34,27 +34,35 @@ API Management supports two client compatibility options for AI APIs from Micros
 
     Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. Deployment name is passed in the request body. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.
 
+* **Azure OpenAI v1** - Manage Azure OpenAI in Microsoft Foundry model deployments, using the [Azure OpenAI API version 1](/azure/foundry/openai/api-version-lifecycle).
+
+    Clients call the deployment at an Azure OpenAI v1 model endpoint such as `my-model/models/chat/completions`. Deployment name is passed in the request body.
+
 ## Prerequisites
 
 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
 
 - A Foundry tool in your subscription with one or more models deployed. Examples include models deployed in Microsoft Foundry or Azure OpenAI.
 
+- If you want to enable semantic caching for the API, see [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md) for prerequisites.
+
+- If you want to enforce content safety checks on the API, see [Enforce content safety checks on LLM requests](llm-content-safety-policy.md) for prerequisites.
+
 ## Import Microsoft Foundry API using the portal
 
 Use the following steps to import an AI API to API Management.
 
 When you import the API, API Management automatically configures:
 
-* Operations for each of the API's REST API endpoints
+* Operations for each of the API's REST API endpoints.
 * A system-assigned identity with the necessary permissions to access the Foundry tool deployment.
 * A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure AI Services endpoint.
 * Authentication to the backend using the instance's system-assigned managed identity.
 * (optionally) Policies to help you monitor and manage the API.
 
 To import a Microsoft Foundry API to API Management:
 
-1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
+1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
 1. Under **Create from Azure resource**, select **Microsoft Foundry**.
 
```
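The hunk above describes routes where, unlike the classic Azure OpenAI shape, the path is fixed at `/models/chat/completions` and the deployment is named in the JSON body. A rough sketch of the request a client would assemble for the Azure AI (`/models`) route; the gateway hostname, base path, and deployment name are placeholders:

```python
import json

GATEWAY = "https://contoso.azure-api.net"  # API Management gateway (placeholder)
BASE_PATH = "my-model"                     # Base path chosen at import (placeholder)

# Azure AI (Model Inference) route: the path does not vary per deployment;
# the deployment to use is named in the request body instead.
url = f"{GATEWAY}/{BASE_PATH}/models/chat/completions"
body = json.dumps({
    "model": "my-deployment",  # deployment name in the body (placeholder)
    "messages": [{"role": "user", "content": "Hello"}],
})
print(url)
print(body)
```

Because only the body names the deployment, switching a client between models exposed through the Azure AI Model Inference API and Azure OpenAI deployments needs no URL change, which is the flexibility the article calls out.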
```diff
@@ -68,9 +76,10 @@ To import a Microsoft Foundry API to API Management:
 1. Enter a **Display name** and optional **Description** for the API.
 1. In **Base path**, enter a path that your API Management instance uses to access the deployment endpoint.
 1. Optionally select one or more **Products** to associate with the API.
-1. In **Client compatibility**, select either of the following based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
+1. In **Client compatibility**, select one of the following based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
     * **Azure OpenAI** - Select this option if your clients only need to access Azure OpenAI in Microsoft Foundry model deployments.
-    * **Azure AI** - Select this option if your clients need to access other models in Microsoft Foundry.
+    * **Azure AI** - Select this option if your clients need to access other models in Microsoft Foundry.
+    * **Azure OpenAI v1** - Select this option if you want to use the Azure OpenAI API version 1 with your Foundry model deployments.
 1. Select **Next**.
 
 :::image type="content" source="media/azure-ai-foundry-api/client-compatibility.png" alt-text="Screenshot of Microsoft Foundry API configuration in the portal.":::
```
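Once the API is imported, clients typically present an API Management subscription key to the gateway (the backend itself is reached with the instance's managed identity, as configured during import). A sketch of preparing such a request with APIM's default `Ocp-Apim-Subscription-Key` header, without sending it; the URL and key are placeholders, and your instance may use a different header name or products without keys:

```python
import json
import urllib.request

# Placeholder gateway URL using the Azure AI /models route shape.
url = "https://contoso.azure-api.net/my-model/models/chat/completions"
payload = {
    "model": "my-deployment",  # deployment name in the body (placeholder)
    "messages": [{"role": "user", "content": "ping"}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Default APIM subscription key header; configurable per API.
        "Ocp-Apim-Subscription-Key": "<your-subscription-key>",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the call; omitted here because
# the endpoint above is only a placeholder.
```

Note that urllib stores header names in capitalized form (`Ocp-apim-subscription-key`); the gateway treats the header name case-insensitively.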