Commit d7b9403

Merge pull request #312438 from JKirsch1/foundry-rebrand-batch-11
[SCOPED] Foundry branding
2 parents d32d95a + 21aaa37 commit d7b9403

16 files changed

Lines changed: 19 additions & 19 deletions

File tree

articles/azure-resource-manager/bicep/bicep-mcp-server.md

Lines changed: 2 additions & 2 deletions
@@ -20,7 +20,7 @@ The Bicep MCP (Model Context Protocol) server provides AI agents with tools to h
 * **`list_avm_metadata`** - Lists metadata for all Azure Verified Modules (AVM).
 * **`list_az_resource_types_for_provider`** - Lists all Azure resource types for a specific provider, such as Microsoft.Storage.
 
-Use the Bicep MCP server directly in [Visual Studio Code](#visual-studio-code). You can also run it locally with [MCP-compatible services](#integration-with-other-foundry-tools).
+Use the Bicep MCP server directly in [Visual Studio Code](#visual-studio-code). You can also run it locally with [MCP-compatible services](#integration-with-other-ai-services).
 
 ## Limitations

@@ -35,7 +35,7 @@ There's no way to definitively guarantee whether the agent orchestrator uses any
 
 The Bicep MCP server is available starting with Visual Studio Code Bicep extension version 0.40.2. For more information about installing, managing, and using Bicep MCP server from VS Code, see [Bicep MCP server](./visual-studio-code.md#bicep-mcp-server).
 
-## Integration with other Foundry Tools
+## Integration with other AI services
 
 You can run the Azure Bicep MCP server locally for Claude Desktop and Code, OpenAI Codex CLI, and LMStudio where you can use it with various models.
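For local use with an MCP client such as Claude Desktop, the server is registered in the client's configuration file using the standard `mcpServers` shape. The sketch below is illustrative only: the launch command is a placeholder, so substitute the actual command documented for the Bicep MCP server.

```json
{
  "mcpServers": {
    "bicep": {
      "command": "<command-that-starts-the-bicep-mcp-server>",
      "args": []
    }
  }
}
```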

articles/communication-services/concepts/ai.md

Lines changed: 1 addition & 1 deletion
@@ -57,5 +57,5 @@ Similar to Azure Communication Services messaging, there are REST APIs for many
 - **[Call Automation REST APIs and SDKs](../concepts/call-automation/call-automation.md)**: Services and AI applications use Call Automation REST APIs to answer, route, and manage all types of Azure voice and video calls.
 - **[Service-to-service audio streaming](../concepts/call-automation/audio-streaming-concept.md)**: AI applications use Azure's service-to-service WebSockets API to stream audio data. This works in both directions; your AI can listen to a call and speak.
 - **[Service-to-service real-time transcription](../concepts/call-automation/real-time-transcription.md)**: AI applications use Azure's service-to-service WebSockets API to stream a real-time, Azure-generated transcription. Compared to audio or video content, transcript data is often easier for AI models to reason upon.
-- **[Call recording](../concepts/voice-video-calling/call-recording.md)**: You can record Azure calls in your own datastore and then direct Foundry Tools to process that content.
+- **[Call recording](../concepts/voice-video-calling/call-recording.md)**: You can record Azure calls in your own datastore and then direct the AI service to process that content.
 - **[Client raw audio and video](../concepts/voice-video-calling/media-access.md)**: The Calling client SDK provides APIs for accessing and modifying the raw audio and video feed. An example scenario is taking the video feed, using computer vision to distinguish the human speaker from their background, and customizing that background.

articles/communication-services/how-tos/call-automation/includes/audio-streaming-quickstart-csharp.md

Lines changed: 1 addition & 1 deletion
@@ -159,7 +159,7 @@ When DTMF is enabled Azure Communication Services sends a `DtmfData` type.
 ## Sending audio streaming data to Azure Communication Services
 If bidirectional streaming is enabled using the `EnableBidirectional` flag in the `MediaStreamingOptions`, you can stream audio data back to Azure Communication Services, which plays the audio into the call.
 
-Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your Foundry Tools. After your Foundry Tool processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
+Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your AI services. After your AI service processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
 
 The example demonstrates how another service, such as Azure OpenAI or other voice-based Large Language Models, processes and transmits the audio data back into the call.

articles/communication-services/how-tos/call-automation/includes/audio-streaming-quickstart-java.md

Lines changed: 1 addition & 1 deletion
@@ -143,7 +143,7 @@ When DTMF is enabled Azure Communication Services sends a `DtmfData` type.
 ## Sending audio streaming data to Azure Communication Services
 If bidirectional streaming is enabled using the `EnableBidirectional` flag in the `MediaStreamingOptions`, you can stream audio data back to Azure Communication Services, which plays the audio into the call.
 
-Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your Foundry Tools. After your Foundry Tool processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
+Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your AI services. After your AI service processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
 
 The example demonstrates how another service, such as Azure OpenAI or other voice-based Large Language Models, processes and transmits the audio data back into the call.

articles/communication-services/how-tos/call-automation/includes/audio-streaming-quickstart-js.md

Lines changed: 1 addition & 1 deletion
@@ -184,7 +184,7 @@ When DTMF is enabled Azure Communication Services sends a `DtmfData` type.
 ## Sending audio streaming data to Azure Communication Services
 If bidirectional streaming is enabled using the `EnableBidirectional` flag in the `MediaStreamingOptions`, you can stream audio data back to Azure Communication Services, which plays the audio into the call.
 
-Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your Foundry Tools. After your Foundry Tool processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
+Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your AI services. After your AI service processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
 
 The example demonstrates how another service, such as Azure OpenAI or other voice-based Large Language Models, processes and transmits the audio data back into the call.

articles/communication-services/how-tos/call-automation/includes/audio-streaming-quickstart-python.md

Lines changed: 1 addition & 1 deletion
@@ -156,7 +156,7 @@ When DTMF is enabled Azure Communication Services sends a `DtmfData` type.
 ## Sending audio streaming data to Azure Communication Services
 If bidirectional streaming is enabled using the `EnableBidirectional` flag in the `MediaStreamingOptions`, you can stream audio data back to Azure Communication Services, which plays the audio into the call.
 
-Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your Foundry Tools. After your Foundry Tool processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
+Once Azure Communication Services begins streaming audio to your WebSocket server, you can relay the audio to your AI services. After your AI service processes the audio content, you can stream the audio back to the ongoing call in Azure Communication Services.
 
 The example demonstrates how another service, such as Azure OpenAI or other voice-based Large Language Models, processes and transmits the audio data back into the call.
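The relay described in these quickstarts forwards audio as base64-encoded payloads inside JSON text frames over the WebSocket. A minimal Python sketch of just the framing step follows; the exact message schema (`kind`, `audioData`, `data`) is an assumption for illustration, so verify field names against the Azure Communication Services bidirectional audio streaming reference before relying on them:

```python
import base64
import json


def make_audio_frame(pcm16: bytes) -> str:
    """Wrap raw 16-bit PCM audio into an outbound streaming message.

    NOTE: the "kind"/"audioData"/"data" field names are assumptions for
    illustration; check the ACS bidirectional streaming docs for the
    authoritative schema.
    """
    return json.dumps({
        "kind": "AudioData",
        "audioData": {
            # Binary audio must be base64-encoded to travel in a text frame.
            "data": base64.b64encode(pcm16).decode("ascii"),
        },
    })


def read_audio_frame(message: str) -> bytes:
    """Extract the raw PCM bytes from an inbound AudioData message."""
    payload = json.loads(message)
    if payload.get("kind") != "AudioData":
        raise ValueError("not an AudioData message")
    return base64.b64decode(payload["audioData"]["data"])
```

In a real relay, `read_audio_frame` would run on each inbound WebSocket message before handing audio to the AI service, and `make_audio_frame` would wrap the service's synthesized audio before sending it back into the call.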

articles/confidential-computing/confidential-ai.md

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 ---
 title: Confidential AI
 titleSuffix: Azure Confidential Computing
-description: Confidential Foundry Tools and solutions
+description: Confidential AI services and solutions
 services: virtual-machines
 author: kapilv
 ms.service: azure-confidential-computing

articles/connectors/built-in.md

Lines changed: 2 additions & 2 deletions
@@ -362,13 +362,13 @@ Azure Logic Apps provides the following built-in actions for working with data o
 [![Data Operations icon][data-operations-icon]][data-operations-doc]
 <br><br>[**Data Operations**][data-operations-doc]
 <br><br>Perform operations with data.
-<br><br>**Chunk text**: Split up content into pieces to use in AI solutions or with Foundry Tool operations such as [Azure OpenAI and Azure AI Search operations](../logic-apps/connectors/azure-ai.md). For more information, see [Parse or chunk content](../logic-apps/parse-document-chunk-text.md).
+<br><br>**Chunk text**: Split up content into pieces to use in AI solutions or with Foundry Tools operations such as [Azure OpenAI and Azure AI Search operations](../logic-apps/connectors/azure-ai.md). For more information, see [Parse or chunk content](../logic-apps/parse-document-chunk-text.md).
 <br><br>**Compose**: Create a single output from multiple inputs with various types.
 <br><br>**Create CSV table**: Create a comma-separated-value (CSV) table from an array with JSON objects.
 <br><br>**Create HTML table**: Create an HTML table from an array with JSON objects.
 <br><br>**Filter array**: Create an array from items in another array that meet your criteria.
 <br><br>**Join**: Create a string from all items in an array and separate those items with the specified delimiter.
-<br><br>**Parse a document**: Create a tokenized string to use in AI solutions or with Foundry Tool operations such as [Azure OpenAI and Azure AI Search operations](../logic-apps/connectors/azure-ai.md). For more information, see [Parse or chunk content](../logic-apps/parse-document-chunk-text.md).
+<br><br>**Parse a document**: Create a tokenized string to use in AI solutions or with Foundry Tools operations such as [Azure OpenAI and Azure AI Search operations](../logic-apps/connectors/azure-ai.md). For more information, see [Parse or chunk content](../logic-apps/parse-document-chunk-text.md).
 <br><br>**Parse JSON**: Create user-friendly tokens from properties and their values in JSON content so that you can use those properties in your workflow.
 <br><br>**Select**: Create an array with JSON objects by transforming items or values in another array and mapping those items to specified properties.
 :::column-end:::
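The **Chunk text** action in the diff above performs the kind of fixed-size splitting that retrieval and embedding pipelines need. A minimal Python sketch of chunking with overlap follows; the `chunk_size` and `overlap` parameter names are illustrative stand-ins, not the Logic Apps action's actual inputs:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 20) -> list[str]:
    """Split text into fixed-size chunks with a small overlap, so that
    content cut at a chunk boundary still appears intact in one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # advance less than a full chunk each time
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]
```

Overlapping chunks trade a little storage for robustness: a sentence that straddles a boundary is still fully contained in at least one chunk, which tends to improve downstream search and summarization quality.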

articles/digital-twins/overview.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ Take advantage of your domain expertise on top of Azure Digital Twins to build c
 * Connect assets such as IoT devices and existing business systems, using a robust event system to build dynamic business logic and data processing
 * Query the live execution environment to extract real-time insights from your twin graph
 * Build connected 3D visualizations of your environment that display business logic and twin data in context
-* Query historized environment data and integrate with other Azure data, analytics, and Foundry Tools to better track the past and predict the future
+* Query historized environment data and integrate with other Azure data, analytics, and AI services to better track the past and predict the future
 
 ## Define your business environment

articles/digital-twins/quickstart-azure-digital-twins-explorer.md

Lines changed: 1 addition & 1 deletion
@@ -226,7 +226,7 @@ The intent of this exercise is to demonstrate how you can use the Azure Digital 
 
 In this quickstart, you made the temperature update manually. It's common in Azure Digital Twins to connect digital twins to real IoT devices so that they receive updates automatically, based on device data. You can also [connect other data sources](concepts-data-ingress-egress.md#data-ingress), integrating data from different systems and defining your own logic for how twins are updated. In this way, you can build a live graph that always reflects the real state of your environment. You can use queries to get information about what's happening in your environment in real time.
 
-You can also export Azure Digital Twins data to historical tracking, data analytics, and Foundry Tools to enable greater insights and perform environment simulations. Integrating Azure Digital Twins into your IoT solutions can help you more effectively track the past, control the present, and predict the future.
+You can also export Azure Digital Twins data to historical tracking, data analytics, and AI services to enable greater insights and perform environment simulations. Integrating Azure Digital Twins into your IoT solutions can help you more effectively track the past, control the present, and predict the future.
 
 ## Clean up resources
