Commit ea21663

2 parents 3a7dfe4 + d341c3f
Commit message: changes updated

272 files changed: 3802 additions & 1567 deletions


articles/api-center/TOC.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -56,6 +56,8 @@
       href: register-discover-mcp-server.md
     - name: Export API from API Center to Copilot Studio
       href: export-to-copilot-studio.yml
+    - name: Track API dependencies
+      href: track-resource-dependencies.md
     - name: API governance
       items:
       - name: Use metadata for governance
```

articles/api-center/configure-environments-deployments.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -101,4 +101,4 @@ In this tutorial, you learned how to use the portal to:
 ## Related content
 
 * [Learn more about Azure API Center](key-concepts.md)
-
+* [Track API resource dependencies](track-resource-dependencies.md)
```

articles/api-center/includes/api-center-service-limits.md

Lines changed: 15 additions & 14 deletions
```diff
@@ -7,26 +7,27 @@ author: dlepow
 
 ms.service: azure-api-center
 ms.topic: include
-ms.date: 05/19/2025
+ms.date: 08/22/2025
 ms.author: danlep
 ms.custom: Include file
 ---
 
 | Resource | Free plan<sup>1</sup> | Standard plan<sup>2</sup> |
 | ---------------------------------------------------------------------- | -------------------------- |-------------|
-| Maximum number of APIs | 200 | 10,000 |
-| Maximum number of versions per API | 5 | 100 |
-| Maximum number of definitions per version | 5 | 5 |
-| Maximum number of deployments per API | 10 | 10 |
-| Maximum number of environments | 20 | 20 |
-| Maximum number of workspaces | 1 (Default) | 1 (Default) |
-| Maximum number of custom metadata properties per entity<sup>3</sup> | 10 | 20 |
-| Maximum number of child properties in custom metadata property of type "object" | 10 | 10 |
-| Maximum requests per minute (data plane) | 3,000 | 6,000 |
-| Maximum number of API definitions [analyzed](../enable-managed-api-analysis-linting.md) | 10 | 2,000<sup>4</sup> |
-| Maximum number of analysis profiles | 1 | 3 |
-| Maximum number of linked API sources<sup>5</sup> | 1 | 5 |
-| Maximum number of APIs synchronized from a linked API source | 200 | 2,000<sup>4</sup> |
+| APIs | 200 | 10,000 |
+| Versions per API | 5 | 100 |
+| Definitions per version | 5 | 5 |
+| Deployments per API | 10 | 10 |
+| Environments | 20 | 20 |
+| Workspaces | 1 (Default) | 1 (Default) |
+| Custom metadata properties per entity<sup>3</sup> | 10 | 20 |
+| Child properties in custom metadata property of type "object" | 10 | 10 |
+| Requests per minute (data plane) | 3,000 | 6,000 |
+| API definitions [analyzed](../enable-managed-api-analysis-linting.md) | 10 | 2,000<sup>4</sup> |
+| Analysis profiles | 1 | 3 |
+| Linked (integrated) API sources<sup>5</sup> | 1 | 5 |
+| APIs synchronized from a linked API source | 200 | 2,000<sup>4</sup> |
+| Dependencies | 200 | 200 |
 | Semantic search in API Center portal | No | Yes |
 
 <sup>1</sup> Use of full service features including API analysis is limited.<br/>
```
articles/api-center/track-resource-dependencies.md

Lines changed: 88 additions & 0 deletions
---
title: Track API Resource Dependencies - Azure API Center
description: Learn how to track dependencies between APIs and related resources in your Azure API center.
ms.service: azure-api-center
ms.topic: how-to
ms.date: 08/28/2025
ms.author: danlep
author: dlepow
ms.custom:
# Customer intent: As an API developer or API program manager, I want to understand the dependencies between API resources in my organization's API center.
---

# Track API resource dependencies in your API center

This article explains how to track dependencies between APIs and associated resources in your [Azure API center](overview.md). Use the dependency tracker (preview) to map dependencies across APIs, environments, and deployments in your catalog. You can also track dependencies on external components, including GitHub repositories and resources in Azure or other cloud platforms.

Each dependency identifies a *source* resource and a related *target* resource that depends on it. By tracking dependencies between source and target resources, you can:

* **Troubleshoot and resolve issues** more effectively by gaining visibility into the relationships between components

* **Improve the reliability of systems** by identifying risks such as circular dependencies or over-reliance on single points of failure

* **Improve the effectiveness of AI agents** by using mapped dependencies to automatically discover valid endpoints for tasks and validate toolchain compatibility
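The circular-dependency risk mentioned above can be checked mechanically once dependencies are written down as source/target pairs. The following sketch is illustrative only: the pair list is a hypothetical export, not an API Center data format.

```python
# Detect circular dependencies in a list of (source, target) pairs.
# The pairs below are hypothetical examples, not an API Center export format.
from collections import defaultdict

def find_cycle(dependencies):
    """Return one dependency cycle as a list of nodes, or None if acyclic."""
    graph = defaultdict(list)
    for source, target in dependencies:
        graph[source].append(target)

    visited, on_stack = set(), []

    def dfs(node):
        if node in on_stack:                       # back edge: cycle found
            return on_stack[on_stack.index(node):] + [node]
        if node in visited:
            return None
        visited.add(node)
        on_stack.append(node)
        for nxt in graph[node]:
            cycle = dfs(nxt)
            if cycle:
                return cycle
        on_stack.pop()
        return None

    for start in list(graph):
        cycle = dfs(start)
        if cycle:
            return cycle
    return None

deps = [("orders-api", "billing-api"), ("billing-api", "audit-api"),
        ("audit-api", "orders-api")]
print(find_cycle(deps))  # → ['orders-api', 'billing-api', 'audit-api', 'orders-api']
```

A depth-first search with an explicit stack is enough at catalog scale, since the number of dependencies per API center is limited (see the service limits).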
> [!NOTE]
> This is a preview feature and is subject to change. [Limits](/azure/azure-resource-manager/management/azure-subscription-service-limits?toc=/azure/api-center/toc.json&bc=/azure/api-center/breadcrumb/toc.json#azure-api-center-limits) apply.

## Prerequisites

* An [Azure API center](overview.md) resource in your Azure subscription.
* One or more APIs registered in your API center. For instructions, see [Register APIs in your API inventory](register-apis.md).

## Add a dependency

Use the dependency tracker in the Azure portal to add a dependency:

1. In the [Azure portal](https://portal.azure.com), navigate to your API center.
1. In the left menu, under **Assets**, select **Dependency tracker (preview)**.
1. Select **+ Add Dependency**.
1. In the **Dependency Manager** window, enter a **Title** and optionally a **Description** of the dependency.
1. In **Source details**, select a **Source Type** (for example, an API or a related resource). Depending on the type, enter or select identifying information such as a name or ID.
1. In **Target details**, select a **Target Type** (for example, an API or a related resource). Depending on the type, enter or select identifying information such as a name or ID.
1. **Save** the dependency.

:::image type="content" source="media/track-resource-dependencies/create-dependency.png" alt-text="Screenshot of adding a dependency in the portal.":::

The dependency is added.

## View dependencies

API Center provides a table view that lists dependencies individually and a graphical view that maps them holistically. Use these views to explore the relationships between your resources.

To see the graphical view:

1. In the left menu, under **Assets**, select **Dependency tracker (preview)**.
1. Select the **Graph View** tab.

In the graphical view, select the box representing any API center resource to see its details.

:::image type="content" source="media/track-resource-dependencies/view-dependency-graph.png" alt-text="Screenshot of the dependency graph in the portal.":::

## Manage dependencies

You can edit or delete dependencies as needed using the table view.

To view or edit dependency details:

1. In the left menu, under **Assets**, select **Dependency tracker (preview)**.
1. Select **Table View**, and find the dependency that you want to edit.
1. Select **See details**.
1. To make changes, select **Edit**, and update the details.
1. **Save** your changes.

To delete a dependency:

1. In the **Dependency tracker (preview)** table view, find the dependency that you want to delete.
1. Select **Delete dependency** (trash can icon).
1. Confirm the deletion.

> [!NOTE]
> If you delete an API Center resource that is the source or target in a dependency, the dependency isn't automatically deleted. You must delete it yourself.

## Related content

* [Overview of Azure API center](overview.md)
* [Register APIs in your API inventory](register-apis.md)

articles/api-management/TOC.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -252,6 +252,8 @@
       href: azure-openai-enable-semantic-caching.md
     - name: Authenticate and authorize to Azure OpenAI
       href: api-management-authenticate-authorize-azure-openai.md
+    - name: Log LLM tokens, requests, and responses
+      href: api-management-howto-llm-logs.md
     - name: Manage MCP servers
       items:
       - name: MCP server capabilities
```
articles/api-management/api-management-howto-llm-logs.md

Lines changed: 124 additions & 0 deletions
---
title: Set Up Logging for LLM APIs in Azure API Management
titleSuffix: Azure API Management
description: Enable logging for LLM APIs in Azure API Management to track token usage, prompts, and completions for billing and auditing.
#customer intent: As a system administrator, I want to enable logging of LLM request and response messages so that I can track API interactions for billing or auditing purposes.
author: dlepow
ms.service: azure-api-management
ms.topic: how-to
ms.date: 08/22/2025
ms.author: danlep
ai-usage: ai-assisted
ms.collection: ce-skilling-ai-copilot
ms.custom:
---

# Log token usage, prompts, and completions for LLM APIs

In this article, you learn how to set up Azure Monitor logging for LLM API requests and responses in Azure API Management.

The API Management administrator can use LLM API request and response logs, along with API Management gateway logs, for scenarios such as the following:

* **Calculate usage for billing** - Calculate usage metrics for billing based on the number of tokens consumed by each application or API consumer (for example, segmented by subscription ID or IP address).

* **Inspect messages** - Inspect and analyze prompts and completions to help with debugging, auditing, and model evaluation.
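As a sketch of the billing scenario, records exported from the log table can be aggregated per consumer. The field names used here (`subscriptionId`, `totalTokens`) are illustrative assumptions, not the exact column names in the log table:

```python
# Aggregate token consumption per API consumer from exported log records.
# Field names here are illustrative assumptions, not exact log table columns.
from collections import Counter

def tokens_per_consumer(records):
    """Sum total tokens by subscription ID."""
    usage = Counter()
    for record in records:
        usage[record["subscriptionId"]] += record["totalTokens"]
    return dict(usage)

records = [
    {"subscriptionId": "team-a", "totalTokens": 1200},
    {"subscriptionId": "team-b", "totalTokens": 300},
    {"subscriptionId": "team-a", "totalTokens": 800},
]
print(tokens_per_consumer(records))  # → {'team-a': 2000, 'team-b': 300}
```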
Learn more about:

* [AI gateway capabilities in API Management](genai-gateway-capabilities.md)
* [Monitoring API Management](monitor-api-management.md)

## Prerequisites

- An Azure API Management instance.
- A managed LLM chat completions API integrated with Azure API Management. For example, [Import an Azure AI Foundry API](azure-ai-foundry-api.md).
- Access to an Azure Log Analytics workspace.
- Appropriate permissions to configure diagnostic settings and access logs in API Management.

## Enable diagnostic setting for LLM API logs

Enable a diagnostic setting to log requests that the gateway processes for large language model REST APIs. For each request, Azure Monitor receives data about token usage (prompt tokens, completion tokens, and total tokens), the name of the model used, and optionally the request and response messages (prompt and completion). Large requests and responses are split into multiple log entries with sequence numbers for later reconstruction if needed.

The following are brief steps to enable a diagnostic setting that directs LLM API logs to a Log Analytics workspace. For more information, see [Enable diagnostic setting for Azure Monitor logs](monitor-api-management.md#enable-diagnostic-setting-for-azure-monitor-logs).

1. In the [Azure portal](https://portal.azure.com), navigate to your Azure API Management instance.
1. In the left menu, under **Monitoring**, select **Diagnostic settings** > **+ Add diagnostic setting**.
1. Configure the setting to send AI gateway logs to a Log Analytics workspace:
    - Under **Logs**, select **Logs related to generative AI gateway**.
    - Under **Destination details**, select **Send to Log Analytics workspace**.
1. Review or configure other settings and make changes if needed.
1. Select **Save**.

:::image type="content" source="media/api-management-howto-llm-logs/diagnostic-setting.png" alt-text="Screenshot of diagnostic setting for AI gateway logs in the portal.":::

## Enable logging of requests or responses for an LLM API

You can enable diagnostic settings for all APIs or customize logging for specific APIs. The following are brief steps to log both LLM request and response messages for an API. For more information, see [Modify API logging settings](monitor-api-management.md#modify-api-logging-settings).

1. In the left menu of your API Management instance, select **APIs > APIs**, and then select the name of the API.
1. Select the **Settings** tab from the top bar.
1. Scroll down to the **Diagnostic Logs** section, and select the **Azure Monitor** tab.
1. In **Log LLM messages**, select **Enabled**.
1. Select **Log prompts** and enter a size in bytes, such as *32768*.
1. Select **Log completions** and enter a size in bytes, such as *32768*.
1. Review other settings and make changes if needed. Select **Save**.

:::image type="content" source="media/api-management-howto-llm-logs/enable-llm-api-logging.png" alt-text="Screenshot of enabling LLM logging for an API in the portal.":::
> [!NOTE]
> If you enable collection, LLM request or response messages up to 32 KB in size are sent in a single entry. Messages larger than 32 KB are split and logged in 32 KB chunks with sequence numbers for later reconstruction. Request messages and response messages can't exceed 2 MB each.
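The chunking behavior described in the note can be sketched as follows. Splitting a large message into fixed-size, sequence-numbered chunks makes reconstruction a simple sort-and-concatenate; the chunk size is reduced here for readability, and this is an illustration of the idea, not the service's implementation:

```python
# Sketch of splitting a message into sequence-numbered chunks and
# reassembling it, illustrating the gateway's behavior for large messages.
CHUNK_SIZE = 8  # the service uses 32 KB chunks; small here for readability

def split_message(message, chunk_size=CHUNK_SIZE):
    """Split a message into (sequence_number, chunk) entries."""
    return [(seq, message[i:i + chunk_size])
            for seq, i in enumerate(range(0, len(message), chunk_size))]

def reassemble(entries):
    """Rebuild the original message from possibly out-of-order entries."""
    return "".join(chunk for _, chunk in sorted(entries))

message = "a long prompt that exceeds the chunk size"
entries = split_message(message)
# Entries can arrive in any order; sequence numbers restore the original.
assert reassemble(reversed(entries)) == message
```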
## Review analytics workbook for LLM APIs

The Azure Monitor-based **Analytics** dashboard provides insights into LLM API usage and token consumption using data aggregated in a Log Analytics workspace. [Learn more](monitor-api-management.md#get-api-analytics-in-azure-api-management) about analytics in API Management.

1. In the left menu of your API Management instance, select **Monitoring** > **Analytics**.
1. Select the **Language models** tab.
1. Review metrics and visualizations for LLM API token consumption and requests in a selected **Time range**.

:::image type="content" source="media/api-management-howto-llm-logs/analytics-workbook-small.png" alt-text="Screenshot of analytics for language model APIs in the portal." lightbox="media/api-management-howto-llm-logs/analytics-workbook.png":::

## Review Azure Monitor logs for requests and responses

Review the [ApiManagementGatewayLlmLog](/azure/azure-monitor/reference/tables/apimanagementgatewayllmlog) log for details about LLM requests and responses, including token consumption, the model deployment used, and other details over specific time ranges.

Requests and responses (including chunked messages for large requests and responses) appear in separate log entries that you can correlate by using the `CorrelationId` field.

For auditing purposes, use a Kusto query similar to the following to join each request and response in a single record. Adjust the query to include the fields that you want to track.

```kusto
ApiManagementGatewayLlmLog
| extend RequestArray = parse_json(RequestMessages)
| extend ResponseArray = parse_json(ResponseMessages)
| mv-expand RequestArray
| mv-expand ResponseArray
| project
    CorrelationId,
    RequestContent = tostring(RequestArray.content),
    ResponseContent = tostring(ResponseArray.content)
| summarize
    Input = strcat_array(make_list(RequestContent), " . "),
    Output = strcat_array(make_list(ResponseContent), " . ")
    by CorrelationId
| where isnotempty(Input) and isnotempty(Output)
```

:::image type="content" source="media/api-management-howto-llm-logs/llm-log-query-small.png" alt-text="Screenshot of query results for LLM logs in the portal." lightbox="media/api-management-howto-llm-logs/llm-log-query.png":::
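The same join can be sketched offline against exported records. The record shapes below are illustrative assumptions about the JSON arrays carried in `RequestMessages` and `ResponseMessages`, not a guaranteed export format:

```python
# Pair request and response contents per CorrelationId, mirroring the
# Kusto query above. Record shapes are illustrative assumptions.
import json
from collections import defaultdict

def join_interactions(records):
    """Return {correlation_id: (input_text, output_text)} for complete pairs."""
    joined = defaultdict(lambda: ([], []))
    for rec in records:
        inputs, outputs = joined[rec["CorrelationId"]]
        for msg in json.loads(rec.get("RequestMessages") or "[]"):
            inputs.append(msg["content"])
        for msg in json.loads(rec.get("ResponseMessages") or "[]"):
            outputs.append(msg["content"])
    # Keep only interactions where both sides are present, joined with " . "
    return {cid: (" . ".join(i), " . ".join(o))
            for cid, (i, o) in joined.items() if i and o}

records = [
    {"CorrelationId": "abc", "RequestMessages": '[{"content": "Hi"}]'},
    {"CorrelationId": "abc", "ResponseMessages": '[{"content": "Hello!"}]'},
]
print(join_interactions(records))  # → {'abc': ('Hi', 'Hello!')}
```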
## Upload data to Azure AI Foundry for model evaluation

You can export LLM logging data as a dataset for [model evaluation](/azure/ai-foundry/concepts/observability) in Azure AI Foundry. With model evaluation, you can assess the performance of your generative AI models and applications against a test model or dataset using built-in or custom evaluation metrics.

To use LLM logs as a dataset for model evaluation:

1. Join LLM request and response messages into a single record for each interaction, as shown in the [previous section](#review-azure-monitor-logs-for-requests-and-responses). Include the fields you want to use for model evaluation.
1. Export the dataset to CSV format, which is compatible with Azure AI Foundry.
1. In the Azure AI Foundry portal, create a new evaluation to upload and evaluate the dataset.

For details about creating and running a model evaluation in Azure AI Foundry, see [Evaluate generative AI models and applications by using Azure AI Foundry](/azure/ai-foundry/how-to/evaluate-generative-ai-app).
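A minimal sketch of the CSV export step, writing joined interactions with hypothetical column names (`query`, `response`) that you would adapt to the schema your evaluation expects:

```python
# Write joined request/response pairs to a CSV dataset for upload.
# Column names "query" and "response" are hypothetical; adapt them to
# the schema your Azure AI Foundry evaluation expects.
import csv

def export_dataset(interactions, path):
    """interactions: iterable of (input_text, output_text) pairs."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["query", "response"])
        for input_text, output_text in interactions:
            writer.writerow([input_text, output_text])

export_dataset([("What is APIM?", "Azure API Management is a managed gateway.")],
               "llm-eval-dataset.csv")
```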
## Related content

* [Learn more about monitoring API Management](monitor-api-management.md)
* [Azure Monitor reference for API Management](monitor-api-management-reference.md)
* [Tutorial: Monitor published APIs](api-management-howto-use-azure-monitor.md)

articles/api-management/genai-gateway-capabilities.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -109,14 +109,14 @@ In API Management, enable semantic caching by using Azure Redis Enterprise, Azur
 
 ## Logging token usage, prompts, and completions
 
-Enable a [diagnostic setting](monitor-api-management.md#enable-diagnostic-setting-for-azure-monitor-logs) in your API Management instance to log requests processed by the gateway for large language model REST APIs. For each request, data is sent to Azure Monitor including token usage (prompt tokens, completion tokens, and total tokens), name of the model used, and optionally the request and response messages (prompt and completion). Large requests and responses are split into multiple log entries that are sequentially numbered for later reconstruction if needed.
+You can enable logging for requests processed by the gateway for large language model REST APIs. For each request, data is sent to Azure Monitor including token usage (prompt tokens, completion tokens, and total tokens), name of the model used, and optionally the request and response messages (prompt and completion). Large requests and responses are split into multiple log entries that are sequentially numbered for later reconstruction if needed.
 
 The API Management administrator can use LLM gateway logs along with API Management gateway logs for scenarios such as the following:
 
 * **Calculate usage for billing** - Calculate usage metrics for billing based on the number of tokens consumed by each application or API consumer (for example, segmented by subscription ID or IP address).
 * **Inspect messages** - To help with debugging or auditing, inspect and analyze prompts and completions.
 
-Learn more about [monitoring API Management with Azure Monitor](monitor-api-management.md).
+Learn more: [Log token usage, prompts, and completions for LLM APIs](api-management-howto-llm-logs.md)
 
 ## Content safety policy
 
```
```diff
@@ -135,7 +135,7 @@ To help safeguard users from harmful, offensive, or misleading content, you can
 
 * [AI gateway reference architecture using API Management](/ai/playbook/technology-guidance/generative-ai/dev-starters/genai-gateway/reference-architectures/apim-based)
 * [AI hub gateway landing zone accelerator](https://github.com/Azure-Samples/ai-hub-gateway-solution-accelerator)
-* [Designing and implementing a gateway solution with Azure OpenAI resources](/ai/playbook/technology-guidance/generative-ai/dev-starters/genai-gateway/)
+* [Designing and implementing a gateway solution with Azure OpenAI resources](/ai/playbook/technology-guidance/generative-ai/dev-starters/genai-gateway/)
 * [Use a gateway in front of multiple Azure OpenAI deployments or instances](/azure/architecture/ai-ml/guide/azure-openai-gateway-multi-backend)
 
 ## Related content
```
