In this article, you import an Amazon Bedrock language model API into your API Management instance as a passthrough API. This example shows a model that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
For more information about managing AI APIs in API Management, see:
* [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
* [Import a language model API](openai-compatible-llm-api.md)
For more information about Amazon Bedrock, see:
* [What is Amazon Bedrock?](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html)
## Prerequisites
- An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
- An Amazon Web Services (AWS) account with access to Amazon Bedrock, and access to one or more Amazon Bedrock foundation models. [Learn more](https://docs.aws.amazon.com/bedrock/latest/userguide/getting-started-console.html).
## Create IAM user access keys
To authenticate your API Management instance to the Amazon Bedrock API, you need access keys for an AWS IAM user.
To generate the required access key ID and secret key by using the AWS Management Console, see [Create an access key for yourself](https://docs.aws.amazon.com/IAM/latest/UserGuide/access-key-self-managed.html#Using_CreateAccessKey) in the AWS documentation.
Save your access keys in a safe location. You'll store them as named values in the next step.
> [!CAUTION]
> Access keys are long-term credentials, and you should manage them as securely as you would a password. Learn more about [securing access keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/securing_access-keys.html).
## Store IAM user access keys as named values
Securely store the two IAM user access keys as secret [named values](api-management-howto-properties.md) in your Azure API Management instance by using the configuration recommended in the following table.
To import an Amazon Bedrock API to API Management:
1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
1. Under **Define a new API**, select **Language Model API**.
:::image type="content" source="media/openai-compatible-llm-api/openai-api.png" alt-text="Screenshot of creating a passthrough language model API in the portal." :::
1. On the **Configure API** tab:
    1. Enter a **Display name** and optional **Description**.
    1. In **URL**, enter the default Amazon Bedrock endpoint: `https://bedrock-runtime.<aws-region>.amazonaws.com` (for example, `https://bedrock-runtime.us-east-1.amazonaws.com`).
    1. Optionally, select one or more **Products** to associate with the API.
    1. In **Path**, append a path that your API Management instance uses to access the LLM API endpoints.
    1. In **Type**, select **Create a passthrough API**.
    1. Leave **Access key** blank.
:::image type="content" source="media/amazon-bedrock-passthrough-llm-api/configure-api.png" alt-text="Screenshot of language model API configuration in the portal.":::
1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
1. Select **Review**.
1. After the portal validates the settings, select **Create**.
API Management creates the API and (optionally) policies to help you monitor and manage the API.
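For example, a token limit policy can cap model consumption per subscription. The following is a minimal sketch, not a policy that API Management generates; the attribute values are illustrative placeholders, not recommendations:

```xml
<policies>
    <inbound>
        <base />
        <!-- Illustrative values: cap tokens per minute, keyed by subscription -->
        <llm-token-limit counter-key="@(context.Subscription.Id)"
                         tokens-per-minute="5000"
                         estimate-prompt-tokens="true" />
    </inbound>
</policies>
```

For the full set of token limit, semantic caching, and content safety policies, see [Import a language model API](openai-compatible-llm-api.md).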
## Configure policies to authenticate requests to the Amazon Bedrock API
Configure API Management policies to sign requests to the Amazon Bedrock API. [Learn more about signing AWS API requests](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_sigv.html).
The following example uses the *accesskey* and *secretkey* named values you created previously for the AWS access key and secret key. Set the `region` variable to the appropriate value for your Amazon Bedrock API. The example uses `us-east-1`.
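The full signing policy isn't shown here, but its inbound section starts roughly as follows. This is a sketch that assumes the *accesskey* and *secretkey* named values from the earlier step; the final comment stands in for the Signature Version 4 policy expressions:

```xml
<inbound>
    <base />
    <!-- Region of the Bedrock endpoint; this example uses us-east-1 -->
    <set-variable name="region" value="us-east-1" />
    <!-- Credentials come from the secret named values created earlier -->
    <set-variable name="accessKey" value="{{accesskey}}" />
    <set-variable name="secretKey" value="{{secretkey}}" />
    <!-- Expressions that compute the Signature Version 4
         Authorization and X-Amz-Date headers go here -->
</inbound>
```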
1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
1. In the left menu, under **APIs**, select **APIs**.
1. Select the API that you created in the previous section.
1. In the left menu, under **Design**, select **All operations**.
## Call the Bedrock API
To call the Bedrock API through API Management, you can use the AWS Bedrock SDK. This example uses the .NET SDK, but any language that supports the AWS Bedrock API works.
The following example uses a custom HTTP client that instantiates classes defined in the accompanying file `BedrockHttpClientFactory.cs`. The custom HTTP client routes requests to the API Management endpoint and includes the API Management subscription key (if necessary) in the request headers.
This article shows you how to import an OpenAI-compatible Google Gemini API to access models such as `gemini-2.5-flash-lite`. For these models, Azure API Management can manage an OpenAI-compatible chat completions endpoint.
Learn more about managing AI APIs in API Management:
- An API key for the Gemini API. If you don't have one, create it at [Google AI Studio](https://aistudio.google.com/apikey) and store it in a safe location.
## Import an OpenAI-compatible Gemini API by using the portal
1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
1. Under **Define a new API**, select **Language Model API**.
1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
1. Select **Review**.
1. After the portal validates the settings, select **Create**.
API Management creates the API and configures the following:
* A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Google Gemini endpoint.
* Access to the LLM backend by using the Gemini API key you provided. API Management protects the key as a secret [named value](api-management-howto-properties.md).
* (optionally) Policies to help you monitor and manage the API.
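The generated `set-backend-service` policy takes roughly this form; the `backend-id` value shown is a placeholder for the backend resource name that API Management generates:

```xml
<inbound>
    <base />
    <!-- Placeholder id; API Management generates the actual backend name -->
    <set-backend-service backend-id="google-gemini-backend" />
</inbound>
```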
### Test Gemini model
After importing the API, you can test the chat completions endpoint for the API.
1. Select the API that you created in the previous step.
1. Select the **Test** tab.
1. Select the `POST Creates a model response for the given chat conversation` operation, which is a `POST` request to the `/chat/completions` endpoint.
1. In the **Request body** section, enter the following JSON to specify the model and an example prompt. In this example, the `gemini-2.5-flash-lite` model is used.
```json
{
    "model": "gemini-2.5-flash-lite",
    "messages": [
        {
            "role": "system",
}
```
When the test succeeds, the backend responds with a successful HTTP response code and some data. The response includes token usage data to help you monitor and manage your language model token consumption.
:::image type="content" source="media/openai-compatible-google-gemini-api/gemini-test.png" alt-text="Screenshot of testing a Gemini LLM API in the portal.":::