
Commit 551f9ed

Merge pull request #313690 from dlepow/dlepow-1774470084282
[APIM] Update azure-ai-foundry-api article
2 parents 53f69a2 + c9a8acc

3 files changed: 28 additions & 20 deletions

File tree:

articles/api-management/azure-ai-foundry-api.md
@@ -5,7 +5,7 @@ ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
 ms.topic: how-to
-ms.date: 03/24/2026
+ms.date: 03/30/2026
 ms.update-cycle: 180-days
 ms.collection: ce-skilling-ai-copilot
 ms.custom: template-how-to, build-2024
@@ -17,29 +17,37 @@ ms.custom: template-how-to, build-2024
 
 You can import AI model endpoints deployed in Microsoft Foundry to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
 
-Learn more about managing AI APIs in API Management:
+To learn more about managing AI APIs in API Management, see:
 
 * [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
 
 ## Client compatibility options
 
-API Management supports two client compatibility options for AI APIs from Microsoft Foundry. When you import the API using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
+API Management supports the following client compatibility options for AI APIs from Microsoft Foundry. When you import the API by using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
 
 * **Azure OpenAI**: Manage Azure OpenAI in Microsoft Foundry model deployments.
 
-    Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. Deployment name is passed in the request path. Use this option if your Foundry tool only includes Azure OpenAI model deployments.
+    Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. The request path includes the deployment name. Use this option if your Foundry tool only includes Azure OpenAI model deployments.
 
 * **Azure AI**: Manage model endpoints in Microsoft Foundry that are exposed through the [Azure AI Model Inference API](/rest/api/aifoundry/modelinference/).
 
-    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. Deployment name is passed in the request body. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.
+    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. The request body includes the deployment name. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.
+
+* **Azure OpenAI v1**: Manage Azure OpenAI in Microsoft Foundry model deployments by using the [Azure OpenAI v1 API](/azure/foundry/openai/api-version-lifecycle).
+
+    Clients call the deployment at an Azure OpenAI v1 model endpoint such as `openai/v1/my-model/chat/completions`. The request body includes the deployment name.
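The three options above differ mainly in where the deployment name travels: in the request path, or in the request body. As a minimal client-side sketch of the three request shapes (the gateway host `contoso.azure-api.net`, base path `my-base-path`, and deployment name `my-deployment` are hypothetical placeholders, and exact path segments depend on how you configure the API):

```python
# Sketch of the three client compatibility request shapes.
# Gateway URL, base path, and deployment name are placeholders.
GATEWAY = "https://contoso.azure-api.net/my-base-path"
DEPLOYMENT = "my-deployment"

# Azure OpenAI: the deployment name travels in the request path.
azure_openai_url = f"{GATEWAY}/openai/deployments/{DEPLOYMENT}/chat/completions"
azure_openai_body = {"messages": [{"role": "user", "content": "Hello"}]}

# Azure AI (Model Inference API): fixed /models path; the deployment
# (model) name travels in the request body instead.
azure_ai_url = f"{GATEWAY}/models/chat/completions"
azure_ai_body = {
    "model": DEPLOYMENT,
    "messages": [{"role": "user", "content": "Hello"}],
}

# Azure OpenAI v1: fixed openai/v1 path; deployment name in the body.
openai_v1_url = f"{GATEWAY}/openai/v1/chat/completions"
openai_v1_body = {
    "model": DEPLOYMENT,
    "messages": [{"role": "user", "content": "Hello"}],
}
```

Because the Azure AI and Azure OpenAI v1 shapes carry the model name in the body, a client can switch deployments without changing the URL it calls.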
 
 ## Prerequisites
 
 * An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
 
 * A Foundry tool in your subscription with one or more models deployed. Examples include models deployed in Microsoft Foundry or Azure OpenAI.
 
-## Import Microsoft Foundry API using the portal
+- If you want to enable semantic caching for the API, see [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md) for prerequisites.
+
+- If you want to enforce content safety checks on the API, see [Enforce content safety checks on LLM requests](llm-content-safety-policy.md) for prerequisites.
+
+## Import Microsoft Foundry API by using the portal
 
 Use the following steps to import an AI API to API Management.

@@ -48,28 +56,29 @@ When you import the API, API Management automatically configures:
 * Operations for each of the API's REST API endpoints.
 * A system-assigned identity with the necessary permissions to access the Foundry tool deployment.
 * A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure AI Services endpoint.
-* Authentication to the backend using the instance's system-assigned managed identity.
+* Authentication to the backend by using the instance's system-assigned managed identity.
 * (optionally) Policies to help you monitor and manage the API.
 
 To import a Microsoft Foundry API to API Management:
 
-1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
+1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
 1. Under **Create from Azure resource**, select **Microsoft Foundry**.
 
     :::image type="content" source="media/azure-ai-foundry-api/ai-foundry-api.png" alt-text="Screenshot of creating an OpenAI-compatible API in the portal." :::
 1. On the **Select AI Service** tab:
     1. Select the **Subscription** in which to search for Foundry Tools. To get information about the model deployments in a service, select the **deployments** link next to the service name.
-        :::image type="content" source="media/azure-ai-foundry-api/deployments.png" alt-text="Screenshot of deployments for an AI service in the portal.":::
-    1. Select a Foundry tool.
+        :::image type="content" source="media/azure-ai-foundry-api/deployments.png" alt-text="Screenshot of deployments for an AI service in the portal." lightbox="media/azure-ai-foundry-api/deployments.png":::
+    1. Select a Foundry tool.
     1. Select **Next**.
 1. On the **Configure API** tab:
     1. Enter a **Display name** and optional **Description** for the API.
     1. In **Base path**, enter a path that your API Management instance uses to access the deployment endpoint.
-    1. Optionally, select one or more **Products** to associate with the API.
-    1. In **Client compatibility**, select either of the following based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
-        * **Azure OpenAI**: Select this option if your clients only need to access Azure OpenAI in Microsoft Foundry model deployments.
-        * **Azure AI**: Select this option if your clients need to access other models in Microsoft Foundry.
+    1. Optionally select one or more **Products** to associate with the API.
+    1. In **Client compatibility**, select one of the following options based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
+        * **Azure OpenAI**: Select this option if your clients only need to access Azure OpenAI in Microsoft Foundry model deployments.
+        * **Azure AI**: Select this option if your clients need to access other models in Microsoft Foundry.
+        * **Azure OpenAI v1**: Select this option if you want to use the Azure OpenAI v1 API with your Foundry model deployments.
 1. Select **Next**.
 
     :::image type="content" source="media/azure-ai-foundry-api/client-compatibility.png" alt-text="Screenshot of Microsoft Foundry API configuration in the portal.":::
@@ -79,15 +88,14 @@ To import a Microsoft Foundry API to API Management:
     * [Track token usage](llm-emit-token-metric-policy.md)
 1. On the **Apply semantic caching** tab, optionally enter settings, or accept defaults that define the policies to help optimize performance and reduce latency for the API:
     * [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md)
-1. On the **AI content safety**, optionally enter settings, or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
+1. On the **AI content safety** tab, optionally enter settings or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
     * [Enforce content safety checks on LLM requests](llm-content-safety-policy.md)
 1. Select **Review**.
-1. After settings are validated, select **Create**.
+1. After the portal validates the settings, select **Create**.
 
 ## Test the AI API
 
-To ensure that your AI API is working as expected, test it in the API Management test console.
-
+To make sure your AI API works as expected, test it in the API Management test console.
 1. Select the API you created in the previous step.
 1. Select the **Test** tab.
 1. Select an operation that's compatible with the model deployment.

@@ -108,9 +116,9 @@ To ensure that your AI API is working as expected, test it in the API Management
     ```

 > [!NOTE]
-> In the test console, API Management automatically populates an **Ocp-Apim-Subscription-Key** header, and configures the subscription key of the built-in [all-access subscription](api-management-subscriptions.md#all-access-subscription). This key enables access to every API in the API Management instance. Optionally display the **Ocp-Apim-Subscription-Key** header by selecting the "eye" icon next to the **HTTP Request**.
+> In the test console, API Management automatically adds an **Ocp-Apim-Subscription-Key** header and sets the subscription key for the built-in [all-access subscription](api-management-subscriptions.md#all-access-subscription). This key provides access to every API in the API Management instance. To optionally display the **Ocp-Apim-Subscription-Key** header, select the "eye" icon next to the **HTTP Request**.
 1. Select **Send**.
 
-When the test is successful, the backend responds with a successful HTTP response code and some data. Appended to the response is token usage data to help you monitor and manage your language model token consumption.
+When the test is successful, the backend responds with a successful HTTP response code and some data. The response includes token usage data to help you monitor and manage your language model token consumption.
 
 [!INCLUDE [api-management-define-api-topics.md](../../includes/api-management-define-api-topics.md)]
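Outside the test console, clients must pass a subscription key themselves in the same **Ocp-Apim-Subscription-Key** header the console fills in automatically. A minimal sketch of such a request for the **Azure OpenAI** compatibility option, assuming hypothetical gateway URL, base path, deployment name, key value, and `api-version` (check your instance and deployment for the actual values):

```python
import json
import urllib.request

# Hypothetical gateway URL, base path, deployment, and api-version.
url = ("https://contoso.azure-api.net/my-base-path/openai/deployments/"
       "my-deployment/chat/completions?api-version=2024-02-01")
payload = {"messages": [{"role": "user", "content": "Hello"}]}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Same header the test console populates automatically;
        # replace the placeholder with a real subscription key.
        "Ocp-Apim-Subscription-Key": "<your-subscription-key>",
    },
    method="POST",
)
# response = urllib.request.urlopen(request)  # not executed in this sketch
```

Because the key grants access to any API it's scoped to, treat it as a secret and prefer a product-scoped subscription over the built-in all-access one for real clients.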