
Commit 35c666c

Merge pull request #312406 from dlepow/llmf

[APIM] Freshness updates - AI content

2 parents 4ce8555 + e7430c9

4 files changed

Lines changed: 59 additions & 59 deletions

File tree

articles/api-management/amazon-bedrock-passthrough-llm-api.md

Lines changed: 18 additions & 18 deletions
@@ -5,7 +5,7 @@ ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
 ms.topic: how-to
-ms.date: 07/07/2025
+ms.date: 02/26/2026
 ms.update-cycle: 180-days
 ms.collection: ce-skilling-ai-copilot
 ms.custom: template-how-to, build-2024
@@ -15,79 +15,79 @@ ms.custom: template-how-to, build-2024
 
 [!INCLUDE [api-management-availability-all-tiers](../../includes/api-management-availability-all-tiers.md)]
 
-In this article, you import an Amazon Bedrock language model API into your API Management instance as a passthrough API. This is an example of a model that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
+In this article, you import an Amazon Bedrock language model API into your API Management instance as a passthrough API. This example shows a model that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
 
-Learn more about managing AI APIs in API Management:
+For more information about managing AI APIs in API Management, see:
 
 * [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
 * [Import a language model API](openai-compatible-llm-api.md)
 
-Learn more about Amazon Bedrock:
+For more information about Amazon Bedrock, see:
 
 * [What is Amazon Bedrock?](https://docs.aws.amazon.com/bedrock/latest/userguide/what-is-bedrock.html)
 
 
 ## Prerequisites
 
 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
-- An Amazon Web Services (AWS) account with access to Amazon Bedrock, and access to one or more Amazon Bedrock foundation models. [Learn more](https://docs.aws.amazon.com/bedrock/latest/userguide/getting-started-console.html)
+- An Amazon Web Services (AWS) account with access to Amazon Bedrock, and access to one or more Amazon Bedrock foundation models. [Learn more](https://docs.aws.amazon.com/bedrock/latest/userguide/getting-started-console.html).
 
 ## Create IAM user access keys
 
 To authenticate your API Management instance to the Amazon Bedrock API, you need access keys for an AWS IAM user.
 
-To generate the required access key ID and secret key using the AWS Management Console, see [Create an access key for yourself](https://docs.aws.amazon.com/IAM/latest/UserGuide/access-key-self-managed.html#Using_CreateAccessKey) in the AWS documentation.
+To generate the required access key ID and secret key by using the AWS Management Console, see [Create an access key for yourself](https://docs.aws.amazon.com/IAM/latest/UserGuide/access-key-self-managed.html#Using_CreateAccessKey) in the AWS documentation.
 
 Save your access keys in a safe location. You'll store them as named values in the next step.
 
 > [!CAUTION]
-> Access keys are long-term credentials, and you should manage them as securely as you would a password. Learn more about [securing access keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/securing_access-keys.html)
+> Access keys are long-term credentials, and you should manage them as securely as you would a password. Learn more about [securing access keys](https://docs.aws.amazon.com/IAM/latest/UserGuide/securing_access-keys.html).
 
 ## Store IAM user access keys as named values
 
-Securely store the two IAM user access keys as secret [named values](api-management-howto-properties.md) in your Azure API Management instance using the configuration recommended in the following table.
+Securely store the two IAM user access keys as secret [named values](api-management-howto-properties.md) in your Azure API Management instance by using the configuration recommended in the following table.
 
 
 | AWS secret | Name | Secret value |
 |------------|------|--------------|
 | Access key | *accesskey* | Access key ID retrieved from AWS |
 | Secret access key | *secretkey* | Secret access key retrieved from AWS |
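
As a side note on the step above: the two named values can also be created programmatically through Azure Resource Manager. The following sketch only builds the ARM request URL and body for a secret named value; the subscription, resource group, service name, and `api-version` are placeholder assumptions, not values from this commit.

```python
import json

# Placeholder identifiers for illustration only.
SUB, RG, APIM = "<subscription-id>", "<resource-group>", "<apim-name>"
API_VERSION = "2024-05-01"  # assumption; check the current ARM api-version

def named_value_request(named_value_id: str, display_name: str, value: str):
    """URL and body for the ARM PUT that creates a secret named value
    (resource type Microsoft.ApiManagement/service/namedValues)."""
    url = (
        f"https://management.azure.com/subscriptions/{SUB}"
        f"/resourceGroups/{RG}/providers/Microsoft.ApiManagement"
        f"/service/{APIM}/namedValues/{named_value_id}"
        f"?api-version={API_VERSION}"
    )
    body = {"properties": {"displayName": display_name, "value": value, "secret": True}}
    return url, json.dumps(body)

# One named value per IAM access key, matching the table above.
for nv_id, secret in [("accesskey", "<access-key-id>"), ("secretkey", "<secret-access-key>")]:
    url, body = named_value_request(nv_id, nv_id, secret)
    print(url)
```

Marking the named value as `secret: true` keeps the key value hidden in the portal, mirroring the guidance in the table.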
 
-## Import a Bedrock API using the portal
+## Import a Bedrock API by using the portal
 
 To import an Amazon Bedrock API to API Management:
 
-1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
+1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
 1. Under **Define a new API**, select **Language Model API**.
 
     :::image type="content" source="media/openai-compatible-llm-api/openai-api.png" alt-text="Screenshot of creating a passthrough language model API in the portal." :::
 
 1. On the **Configure API** tab:
-    1. Enter a **Display name** and optional **Description** for the API.
+    1. Enter a **Display name** and optional **Description**.
     1. Enter the following **URL** to the default Amazon Bedrock endpoint: `https://bedrock-runtime.<aws-region>.amazonaws.com`.
 
        Example: `https://bedrock-runtime.us-east-1.amazonaws.com`
-    1. Optionally select one or more **Products** to associate with the API.
+    1. Optionally, select one or more **Products** to associate with the API.
     1. In **Path**, append a path that your API Management instance uses to access the LLM API endpoints.
     1. In **Type**, select **Create a passthrough API**.
-    1. Leave values in **Access key** blank.
+    1. Leave **Access key** blank.
 
     :::image type="content" source="media/amazon-bedrock-passthrough-llm-api/configure-api.png" alt-text="Screenshot of language model API configuration in the portal.":::
 
 1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
 1. Select **Review**.
-1. After settings are validated, select **Create**.
+1. After the portal validates the settings, select **Create**.
 
 API Management creates the API and (optionally) policies to help you monitor and manage the API.
 
 ## Configure policies to authenticate requests to the Amazon Bedrock API
 
-Configure API Management policies to sign requests to the Amazon Bedrock API. [Learn more about signing AWS API requests](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_sigv.html)
+Configure API Management policies to sign requests to the Amazon Bedrock API. [Learn more about signing AWS API requests](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_sigv.html).
 
-The following example uses the *accesskey* and *secretkey* named values you created previously for the AWS access key and secret key. Set the `region` variable to the appropriate value for your Amazon Bedrock API. The example uses `us-east-1` for the region.
+The following example uses the *accesskey* and *secretkey* named values you created previously for the AWS access key and secret key. Set the `region` variable to the appropriate value for your Amazon Bedrock API. The example uses `us-east-1`.
 
-1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
+1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs**.
 1. Select the API that you created in the previous section.
 1. In the left menu, under **Design**, select **All operations**.
@@ -259,7 +259,7 @@ The following example uses the *accesskey* and *secretkey* named values you crea
 
 ## Call the Bedrock API
 
-To call the Bedrock API through API Management, you can use the AWS Bedrock SDK. This example uses the .NET SDK, but you can use any language that supports the AWS Bedrock API.
+To call the Bedrock API through API Management, you can use the AWS Bedrock SDK. This example uses the .NET SDK, but any language that supports the AWS Bedrock API works.
 
 The following example uses a custom HTTP client that instantiates classes defined in the accompanying file `BedrockHttpClientFactory.cs`. The custom HTTP client routes requests to the API Management endpoint and includes the API Management subscription key (if necessary) in the request headers.
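
The policies configured in the preceding section perform AWS Signature Version 4 signing. As a rough, language-neutral sketch of the SigV4 key-derivation chain those policies implement (not the policy XML from this commit; the credential values, date, and canonical request below are placeholders):

```python
import hashlib
import hmac

def sign(key: bytes, msg: str) -> bytes:
    """One HMAC-SHA256 step of the SigV4 key-derivation chain."""
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

def signing_key(secret_key: str, date: str, region: str, service: str) -> bytes:
    """Derive the SigV4 signing key: kSecret -> kDate -> kRegion -> kService -> kSigning."""
    k_date = sign(("AWS4" + secret_key).encode("utf-8"), date)
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, "aws4_request")

# Placeholder credentials and request details for illustration only.
secret = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
date = "20260226"
region = "us-east-1"
service = "bedrock"

# The string to sign combines the algorithm, timestamp, credential scope,
# and the SHA-256 hash of the canonical request (elided here).
scope = f"{date}/{region}/{service}/aws4_request"
canonical_request_hash = hashlib.sha256(b"<canonical-request>").hexdigest()
string_to_sign = "\n".join(
    ["AWS4-HMAC-SHA256", date + "T000000Z", scope, canonical_request_hash]
)

signature = hmac.new(
    signing_key(secret, date, region, service),
    string_to_sign.encode("utf-8"),
    hashlib.sha256,
).hexdigest()
print(signature)
```

The resulting hex signature is what ends up in the request's `Authorization` header alongside the credential scope; in the article's setup, the *accesskey* and *secretkey* named values supply the credentials.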


articles/api-management/openai-compatible-google-gemini-api.md

Lines changed: 9 additions & 9 deletions
@@ -5,7 +5,7 @@ ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
 ms.topic: how-to
-ms.date: 07/06/2025
+ms.date: 02/26/2026
 ms.collection: ce-skilling-ai-copilot
 ms.update-cycle: 180-days
 ms.custom: template-how-to
@@ -15,7 +15,7 @@ ms.custom: template-how-to
 
 [!INCLUDE [api-management-availability-all-tiers](../../includes/api-management-availability-all-tiers.md)]
 
-This article shows you how to import an OpenAI-compatible Google Gemini API to access models such as `gemini-2.0-flash`. For these models, Azure API Management can manage an OpenAI-compatible chat completions endpoint.
+This article shows you how to import an OpenAI-compatible Google Gemini API to access models such as `gemini-2.5-flash-lite`. For these models, Azure API Management can manage an OpenAI-compatible chat completions endpoint.
 
 Learn more about managing AI APIs in API Management:
 
@@ -28,9 +28,9 @@ Learn more about managing AI APIs in API Management:
 - An API key for the Gemini API. If you don't have one, create it at [Google AI Studio](https://aistudio.google.com/apikey) and store it in a safe location.
 
 
-## Import an OpenAI-compatible Gemini API using the portal
+## Import an OpenAI-compatible Gemini API by using the portal
 
-1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
+1. In the [Azure portal](https://portal.azure.com), go to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
 1. Under **Define a new API**, select **Language Model API**.
 
@@ -50,12 +50,12 @@ Learn more about managing AI APIs in API Management:
 
 1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
 1. Select **Review**.
-1. After settings are validated, select **Create**.
+1. After the portal validates the settings, select **Create**.
 
 API Management creates the API and configures the following:
 
 * A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Google Gemini endpoint.
-* Access to the LLM backend using the Gemini API key you provided. The key is protected as a secret [named value](api-management-howto-properties.md) in API Management.
+* Access to the LLM backend by using the Gemini API key you provided. API Management protects the key as a secret [named value](api-management-howto-properties.md).
 * (optionally) Policies to help you monitor and manage the API.
 
 ### Test Gemini model
@@ -65,11 +65,11 @@ After importing the API, you can test the chat completions endpoint for the API.
 1. Select the API that you created in the previous step.
 1. Select the **Test** tab.
 1. Select the `POST Creates a model response for the given chat conversation` operation, which is a `POST` request to the `/chat/completions` endpoint.
-1. In the **Request body** section, enter the following JSON to specify the model and an example prompt. In this example, the `gemini-2.0-flash` model is used.
+1. In the **Request body** section, enter the following JSON to specify the model and an example prompt. In this example, the `gemini-2.5-flash-lite` model is used.
 
     ```json
     {
-      "model": "gemini-2.0-flash",
+      "model": "gemini-2.5-flash-lite",
       "messages": [
         {
           "role": "system",
@@ -84,7 +84,7 @@ After importing the API, you can test the chat completions endpoint for the API.
     }
     ```
 
-When the test is successful, the backend responds with a successful HTTP response code and some data. Appended to the response is token usage data to help you monitor and manage your language model token consumption.
+When the test succeeds, the backend responds with a successful HTTP response code and some data. The response includes token usage data to help you monitor and manage your language model token consumption.
 
 :::image type="content" source="media/openai-compatible-google-gemini-api/gemini-test.png" alt-text="Screenshot of testing a Gemini LLM API in the portal.":::
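
Outside the portal's Test tab, the same chat completions request can be sent from client code against the API Management gateway. The following sketch only assembles the request; the gateway URL, path, and subscription key are hypothetical placeholders, not values from this commit.

```python
import json
import urllib.request

def build_chat_request(gateway_url: str, subscription_key: str,
                       model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat completions request for a language model
    API fronted by API Management."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{gateway_url}/chat/completions",
        data=payload,
        method="POST",
        headers={
            "Content-Type": "application/json",
            # API Management subscription key header; omit if the API
            # doesn't require a subscription.
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
    )

# Hypothetical gateway URL and API path; substitute your instance's values.
req = build_chat_request(
    "https://contoso.azure-api.net/gemini",
    "<your-subscription-key>",
    "gemini-2.5-flash-lite",
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
)
# urllib.request.urlopen(req) would send the request; the response carries
# the chat completion plus the appended token usage data described above.
print(req.full_url)
```

Because the imported API is OpenAI-compatible, any OpenAI-style client pointed at the gateway URL with the subscription key header should work the same way.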
