We recommend that you use key vault certificates because doing so improves API Management security:
* Certificates stored in key vaults can be reused across services.
* If you haven't created an API Management instance yet, see [Create an API Management service instance](get-started-create-service-instance.md).
* Configure your backend service for client certificate authentication. For information about configuring certificate authentication in Azure App Service, see [Configure TLS mutual authentication in App Service](../app-service/app-service-web-configure-tls-mutual-auth.md).
* Ensure that you have access to the certificate and its password, either for management in an Azure key vault or for upload to the API Management service. The certificate must be in PFX format. Self-signed certificates are allowed.
* If you use a self-signed certificate or other custom CA certificate and your API Management instance is in one of the classic tiers, install the corresponding root and intermediate CA certificates in API Management to enable validation of the backend service certificate. For more information, see [How to add a custom CA certificate in Azure API Management](api-management-howto-ca-certificates.md).
If you don't install the CA certificates, API Management can't validate the backend service certificate, and requests to the backend service fail unless you disable certificate chain validation. See [Disable certificate chain validation for self-signed certificates](#disable-certificate-chain-validation-for-self-signed-certificates) later in this article.
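To illustrate what certificate chain validation means here, the following is a minimal client-side sketch that uses Python's `ssl` module. This is an analogy only, not how API Management itself is implemented: a default context validates the peer's certificate chain against trusted CAs, which corresponds to API Management validating the backend service certificate, and the relaxed context corresponds to disabling that validation for a self-signed certificate.

```python
import ssl

# A default SSL context verifies the peer's certificate chain against
# trusted CA certificates -- analogous to API Management validating the
# backend service certificate against installed CA certificates.
strict = ssl.create_default_context()
assert strict.verify_mode == ssl.CERT_REQUIRED  # chain is validated

# Turning chain validation off -- analogous to disabling certificate
# chain validation for a self-signed backend certificate. Note that
# check_hostname must be disabled before verify_mode can be CERT_NONE.
relaxed = ssl.create_default_context()
relaxed.check_hostname = False
relaxed.verify_mode = ssl.CERT_NONE
```

As in API Management, the relaxed configuration trades away protection against impersonation of the backend, so it should be limited to test scenarios or certificates you control.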
> [!NOTE]
> When a client certificate is used by API Management for **outbound authentication** (for example, when API Management presents the certificate to a backend service), you don't need to upload the root or intermediate CA certificates to the API Management CA store. In this scenario, API Management *presents* the client certificate and doesn't perform certificate chain validation.<br/><br/>
> Uploading trusted root or intermediate CA certificates is only required when API Management must *validate* a certificate chain, such as during inbound client certificate authentication.
You can import AI model endpoints deployed in Microsoft Foundry to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
To learn more about managing AI APIs in API Management, see:
* [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
## Client compatibility options
API Management supports the following client compatibility options for AI APIs from Microsoft Foundry. When you import the API by using the wizard, choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the Foundry tool.
* **Azure OpenAI**: Manage Azure OpenAI in Microsoft Foundry model deployments.
Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. The request path includes the deployment name. Use this option if your Foundry tool only includes Azure OpenAI model deployments.
* **Azure AI**: Manage model endpoints in Microsoft Foundry that are exposed through the [Azure AI Model Inference API](/rest/api/aifoundry/modelinference/).
Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. The request body includes the deployment name. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.
* **Azure OpenAI v1**: Manage Azure OpenAI in Microsoft Foundry model deployments by using the [Azure OpenAI v1 API](/azure/foundry/openai/api-version-lifecycle).
Clients call the deployment at an Azure OpenAI v1 model endpoint such as `openai/v1/my-model/chat/completions`. The request body includes the deployment name.
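The three options can be compared in a short sketch. This is a minimal illustration only; the gateway hostname, API base path, and deployment name are hypothetical placeholders, not values from this article:

```python
# Sketch: how a client addresses the same deployment under each client
# compatibility option. Hostname, base path, and deployment name below
# are hypothetical placeholders.
GATEWAY = "https://contoso.azure-api.net"
BASE = "my-foundry-api"
DEPLOYMENT = "my-deployment"

def azure_openai_request():
    # Azure OpenAI: the deployment name travels in the request path.
    url = f"{GATEWAY}/{BASE}/openai/deployments/{DEPLOYMENT}/chat/completions"
    body = {"messages": [{"role": "user", "content": "Hello"}]}
    return url, body

def azure_ai_request():
    # Azure AI: clients call a /models endpoint; the deployment name
    # travels in the request body instead of the path.
    url = f"{GATEWAY}/{BASE}/models/chat/completions"
    body = {"model": DEPLOYMENT,
            "messages": [{"role": "user", "content": "Hello"}]}
    return url, body

def azure_openai_v1_request():
    # Azure OpenAI v1: a single openai/v1 endpoint; the deployment
    # name also travels in the request body.
    url = f"{GATEWAY}/{BASE}/openai/v1/chat/completions"
    body = {"model": DEPLOYMENT,
            "messages": [{"role": "user", "content": "Hello"}]}
    return url, body
```

Because the last two options carry the deployment name in the request body, a client can switch models by changing one string in the payload rather than the URL it calls.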
## Prerequisites
* An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
* A Foundry tool in your subscription with one or more models deployed. Examples include models deployed in Microsoft Foundry or Azure OpenAI.
* If you want to enable semantic caching for the API, see [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md) for prerequisites.
* If you want to enforce content safety checks on the API, see [Enforce content safety checks on LLM requests](llm-content-safety-policy.md) for prerequisites.
## Import Microsoft Foundry API by using the portal
Use the following steps to import an AI API to API Management.
When you import the API, API Management automatically configures:
* Operations for each of the API's REST API endpoints.
* A system-assigned identity with the necessary permissions to access the Foundry tool deployment.
* A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure AI Services endpoint.
* Authentication to the backend by using the instance's system-assigned managed identity.
* Optionally, policies to help you monitor and manage the API.
To import a Microsoft Foundry API to API Management:
1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
1. Under **Create from Azure resource**, select **Microsoft Foundry**.
:::image type="content" source="media/azure-ai-foundry-api/ai-foundry-api.png" alt-text="Screenshot of creating an OpenAI-compatible API in the portal." :::
1. On the **Select AI Service** tab:
1. Select the **Subscription** in which to search for Foundry Tools. To get information about the model deployments in a service, select the **deployments** link next to the service name.
:::image type="content" source="media/azure-ai-foundry-api/deployments.png" alt-text="Screenshot of deployments for an AI service in the portal." lightbox="media/azure-ai-foundry-api/deployments.png":::
1. Select a Foundry tool.
1. Select **Next**.
1. On the **Configure API** tab:
1. Enter a **Display name** and optional **Description** for the API.
1. In **Base path**, enter a path that your API Management instance uses to access the deployment endpoint.
1. Optionally select one or more **Products** to associate with the API.
1. In **Client compatibility**, select one of the following options based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
* **Azure OpenAI**: Select this option if your clients only need to access Azure OpenAI in Microsoft Foundry model deployments.
* **Azure AI**: Select this option if your clients need to access other models in Microsoft Foundry.
* **Azure OpenAI v1**: Select this option if you want to use the Azure OpenAI v1 API with your Foundry model deployments.
1. Select **Next**.
:::image type="content" source="media/azure-ai-foundry-api/client-compatibility.png" alt-text="Screenshot of Microsoft Foundry API configuration in the portal.":::
1. On the **Apply semantic caching** tab, optionally enter settings or accept the defaults that define policies to help optimize performance and reduce latency for the API:
* [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md)
1. On the **AI content safety** tab, optionally enter settings or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
* [Enforce content safety checks on LLM requests](llm-content-safety-policy.md)
1. Select **Review**.
1. After the portal validates the settings, select **Create**.
## Test the AI API
To make sure your AI API works as expected, test it in the API Management test console.
1. Select the API you created in the previous step.
1. Select the **Test** tab.
1. Select an operation that's compatible with the model deployment.
> [!NOTE]
> In the test console, API Management automatically adds an **Ocp-Apim-Subscription-Key** header and sets the subscription key for the built-in [all-access subscription](api-management-subscriptions.md#all-access-subscription). This key provides access to every API in the API Management instance. To optionally display the **Ocp-Apim-Subscription-Key** header, select the "eye" icon next to the **HTTP Request**.
1. Select **Send**.
When the test is successful, the backend responds with a successful HTTP response code and some data. The response includes token usage data to help you monitor and manage your language model token consumption.
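Outside the test console, clients must send the subscription key themselves. The following is a minimal sketch using Python's standard library; the gateway URL, deployment name, and key are hypothetical placeholders, and the request is only constructed here, not sent:

```python
import json
import urllib.request

# Hypothetical gateway URL and deployment name; replace with your own.
url = "https://contoso.azure-api.net/my-foundry-api/models/chat/completions"
payload = {
    "model": "my-deployment",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # The header the test console populates for you automatically.
        "Ocp-Apim-Subscription-Key": "<your-subscription-key>",
    },
    method="POST",
)
# Send with urllib.request.urlopen(req) once real values are in place.
```

Any HTTP client works the same way: the only API Management-specific detail is the **Ocp-Apim-Subscription-Key** header carrying a valid subscription key.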