---
title: Use AI tools and models in Azure Functions
description: Learn how Azure Functions supports AI integration in your applications, including LLMs, RAG, agentic workflows, and AI frameworks. Build scalable AI-powered serverless solutions.
ms.topic: concept-article
ms.date: 11/03/2025
ms.update-cycle: 180-days
ai-usage: ai-assisted
ms.custom:
  - build-2025
ms.collection:
  - ce-skilling-ai-copilot
zone_pivot_groups: programming-languages-set-functions
---

# Use AI tools and models in Azure Functions

Azure Functions provides serverless compute resources that integrate with AI and Azure services to streamline building cloud-hosted intelligent applications. This article provides a survey of the breadth of AI-related scenarios, integrations, and other AI resources that you can use in your function apps.

Consider using Azure Functions in your AI-enabled experiences for these scenarios:

| Scenario | Description |
| --- | --- |
| Tools and MCP servers | Functions lets you create and host remote Model Context Protocol (MCP) servers and implement various AI tools. MCP servers are the industry standard for enabling function calling through remote tools. |
| Agentic workflows | Durable Functions helps you create multistep, long-running agent operations with built-in fault tolerance. |
| Retrieval-augmented generation (RAG) | RAG systems require fast data retrieval and processing. Functions can interact with multiple data sources simultaneously and provide the rapid scale required by RAG scenarios. |

Select one of these scenarios to learn more in this article.

This article is language-specific, so make sure you choose your programming language at the top of the page.

## Tools and MCP servers

AI models and agents use function calling to request external resources known as tools. Function calling lets models and agents dynamically invoke specific functionality based on the context of a conversation or task.
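To make the pattern concrete, here's a minimal, stdlib-only sketch of the dispatch side of function calling: the model emits a JSON tool call, and your code routes it to the matching local function and returns a serialized result. The tool name `get_weather` and the message shapes are illustrative, not part of any specific SDK.

```python
import json

# Hypothetical local tools the model may request; the names and
# signatures here are illustrative examples only.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call_json: str) -> str:
    """Route a model-issued tool call to the matching local function."""
    call = json.loads(tool_call_json)
    func = TOOLS[call["name"]]
    result = func(**call["arguments"])
    # The serialized result is returned to the model as the tool output.
    return json.dumps({"name": call["name"], "content": result})
```

For example, `dispatch_tool_call('{"name": "get_weather", "arguments": {"city": "Seattle"}}')` returns a JSON payload whose `content` field holds the tool's answer.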

Functions is particularly well-suited for implementing function calling in agentic workflows because it efficiently scales to handle demand and provides binding extensions that simplify connecting agents with remote Azure services. When you build or host AI tools in Functions, you also get serverless pricing models and platform security features.

The Model Context Protocol (MCP) is the industry standard for interacting with remote servers. It provides a standardized way for AI models and agents to communicate with external systems. An MCP server lets these AI clients efficiently determine the tools and capabilities of an external system.
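Tool discovery is what makes this work: an MCP client sends a JSON-RPC `tools/list` request, and the server replies with each tool's name, description, and input schema. The sketch below builds that response shape with the standard library; the `get_snippet` tool and its schema are illustrative, and this is a simplified view of the protocol rather than a complete server.

```python
import json

# Minimal sketch of the JSON-RPC response an MCP server returns for a
# "tools/list" request, which lets AI clients discover available tools.
# The tool name and schema are illustrative examples.
def tools_list_response(request_id: int) -> str:
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "result": {
            "tools": [
                {
                    "name": "get_snippet",
                    "description": "Return a saved code snippet by name.",
                    "inputSchema": {
                        "type": "object",
                        "properties": {"name": {"type": "string"}},
                        "required": ["name"],
                    },
                }
            ]
        },
    })
```

A client inspects `result.tools` in this response to decide which tools it can call and what arguments each expects.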

Azure Functions currently supports exposing your function code by using these types of tools:

| Tool type | Description |
| --- | --- |
| Remote MCP server | Create custom MCP servers or host SDK-based MCP servers. |
| Queue-based Azure Functions tool | Microsoft Foundry provides a specific Azure Functions tool that enables asynchronous function calling by using message queues. |

### Remote MCP servers

Functions supports these options for creating and hosting remote MCP servers:

- Use the MCP binding extension to create and host custom MCP servers as you would any other function app.
- Self-host MCP servers created by using the official MCP SDKs. This hosting option is currently in preview.

Here's a comparison of the current MCP server hosting options provided by Functions:

| Feature | MCP binding extension | Self-hosted MCP servers |
| --- | --- | --- |
| Current support level | GA | Preview* |
| Programming model | Functions triggers and bindings | Standard MCP SDKs |
| Stateful execution | Supported | Not currently supported |
| Languages currently supported | C# (isolated process)<br>Python<br>TypeScript<br>JavaScript<br>Java | C# (isolated process)<br>Python<br>TypeScript<br>Java |
| Other requirements | None | Streamable HTTP transport |
| How implemented | MCP binding extension | Custom handlers |

*Configuration details for self-hosted MCP servers might change during the preview.

::: zone pivot="programming-language-csharp,programming-language-java,programming-language-typescript,programming-language-python"
Here are some options to help you get started hosting MCP servers in Functions:
::: zone-end
::: zone pivot="programming-language-csharp"

| Options | MCP binding extensions | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extension | n/a |
| Samples | Remote custom MCP server | Weather server |
| Templates | HelloTool | n/a |

::: zone-end
::: zone pivot="programming-language-python"

| Options | MCP binding extensions | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extensions | n/a |
| Samples | Remote custom MCP server | Weather server |

::: zone-end
::: zone pivot="programming-language-typescript"

| Options | MCP binding extensions | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extensions | n/a |
| Samples | Remote custom MCP server | Weather server |

::: zone-end
::: zone pivot="programming-language-javascript"

| Options | MCP binding extensions | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extensions | n/a |
| Samples | Not yet available | n/a |

::: zone-end
::: zone pivot="programming-language-java"

| Options | MCP binding extensions | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extensions | n/a |
| Samples | Not yet available | Not yet available |

::: zone-end
::: zone pivot="programming-language-powershell"
PowerShell isn't currently supported for either MCP server hosting option.
::: zone-end

### Queue-based Azure Functions tools

In addition to MCP servers, you can implement AI tools by using Azure Functions with queue-based communication. Foundry provides Azure Functions-specific tools that enable asynchronous function calling by using message queues. With these tools, AI agents interact with your code by using messaging patterns.

This tool approach is ideal for Foundry scenarios that require:

- Reliable message delivery and processing
- Decoupling between AI agents and function execution
- Built-in retry and error handling capabilities
- Integration with existing Azure messaging infrastructure
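The messaging pattern behind this tool works roughly as follows: the agent enqueues a request tagged with a correlation ID, your queue-triggered function processes it, and the result goes onto an output queue tagged with the same ID so the agent can match it. This stdlib-only sketch uses in-memory queues and an illustrative `add` tool in place of real Azure Storage queues and your actual function logic; the exact message fields are assumptions for illustration.

```python
import json
import queue
import uuid

# In-memory queues stand in for the Azure Storage input and output queues
# that the real Foundry tool configuration points at.
request_q: "queue.Queue[str]" = queue.Queue()
response_q: "queue.Queue[str]" = queue.Queue()

def enqueue_tool_request(tool: str, args: dict) -> str:
    """Agent side: enqueue a tool request tagged with a correlation ID."""
    correlation_id = str(uuid.uuid4())
    request_q.put(json.dumps(
        {"CorrelationId": correlation_id, "tool": tool, "args": args}))
    return correlation_id

def process_one_request() -> None:
    """Function side: simulates the queue-triggered function that reads a
    request, runs the tool logic, and replies on the output queue."""
    msg = json.loads(request_q.get())
    result = sum(msg["args"]["values"]) if msg["tool"] == "add" else None
    response_q.put(json.dumps(
        {"CorrelationId": msg["CorrelationId"], "result": result}))
```

Because the two sides only share queues, the agent and the function scale and fail independently, which is what makes the retry and decoupling benefits above possible.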

::: zone pivot="programming-language-java,programming-language-typescript,programming-language-powershell"
Here are some reference samples for function calling scenarios:
::: zone-end
::: zone pivot="programming-language-csharp"
Agent Service function calling
::: zone-end
::: zone pivot="programming-language-python"
Agent Service function calling
::: zone-end
::: zone pivot="programming-language-javascript"
Agent Service function calling
::: zone-end
::: zone pivot="programming-language-csharp,programming-language-python,programming-language-javascript"
Uses a Foundry Agent Service client to call a custom remote MCP server implemented by using Azure Functions.
::: zone-end
::: zone pivot="programming-language-csharp"
Agents function calling (Azure AI SDKs)
::: zone-end
::: zone pivot="programming-language-python"
Agents function calling (Azure AI SDKs)
::: zone-end
::: zone pivot="programming-language-javascript"
Agents function calling (Azure AI SDKs)
::: zone-end
::: zone pivot="programming-language-csharp,programming-language-python,programming-language-javascript"
Uses function calling features for agents in Azure AI SDKs to implement custom function calling.
::: zone-end

## Agentic workflows

AI-driven processes often determine how to interact with models and other AI assets. However, some scenarios require a higher level of predictability or well-defined steps. These directed agentic workflows orchestrate separate tasks or interactions that agents must follow.

The Durable Functions extension helps you take advantage of the strengths of Functions to create multistep, long-running operations with built-in fault tolerance. These workflows work well for your directed agentic workflows. For example, a trip planning solution might first gather requirements from the user, search for plan options, obtain user approval, and finally make required bookings. In this scenario, you can build an agent for each step and then coordinate their actions as a workflow using Durable Functions.
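Durable Functions orchestrators are written as generator functions that yield each activity call back to the runtime. The following stdlib-only sketch mimics that shape for the trip planning example: a tiny driver loop stands in for the Durable runtime, and the activity names and their stub implementations are purely illustrative.

```python
# Orchestrator sketch in the generator style Durable Functions uses:
# each yield hands an activity request to the runtime and receives its
# result back. Activity names here are illustrative.
def trip_planner(user):
    requirements = yield ("gather_requirements", user)
    options = yield ("search_options", requirements)
    approved = yield ("get_approval", options)
    return ("book", options) if approved else ("cancelled", None)

# Stub activities standing in for real activity functions.
ACTIVITIES = {
    "gather_requirements": lambda user: {"user": user, "dest": "Paris"},
    "search_options": lambda req: [f"flight to {req['dest']}"],
    "get_approval": lambda opts: bool(opts),
}

def run_orchestration(user):
    """Toy driver loop standing in for the Durable Functions runtime."""
    gen = trip_planner(user)
    result = None
    try:
        while True:
            name, arg = gen.send(result)   # orchestrator yields next step
            result = ACTIVITIES[name](arg) # runtime executes the activity
    except StopIteration as done:
        return done.value
```

The real runtime adds what this toy driver lacks: durable checkpoints after each yield, replay on failure, and timers for long waits such as the user-approval step.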

For more workflow scenario ideas, see Application patterns in Durable Functions.

## Retrieval-augmented generation

Because Functions can handle multiple events from various data sources simultaneously, it's an effective solution for real-time AI scenarios, like RAG systems that require fast data retrieval and processing. Rapid event-driven scaling reduces the latency your customers experience, even in high-demand situations.
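The retrieval step your function performs in a RAG system can be sketched in a few lines: score documents against the question, take the top-k, and assemble them into a grounded prompt for the model. Production systems use a vector index such as Azure AI Search; the tiny in-memory corpus and keyword-overlap scoring below are illustrative stand-ins.

```python
# Illustrative in-memory corpus; a real RAG system queries a vector index.
DOCS = [
    "Azure Functions scales automatically with event volume.",
    "Durable Functions adds stateful orchestration to Functions.",
    "Blob Storage stores unstructured data in the cloud.",
]

def retrieve(question: str, k: int = 2) -> list:
    """Rank documents by naive keyword overlap and return the top k."""
    words = set(question.lower().split())
    scored = sorted(
        DOCS,
        key=lambda d: len(words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str) -> str:
    """Assemble the retrieved context and the question into one prompt."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The latency of the `retrieve` step is what event-driven scaling protects: under load, Functions fans out so each request still gets a fast retrieval pass before the model call.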

Here are some reference samples for RAG-based scenarios:

::: zone pivot="programming-language-csharp"
RAG with Azure AI Search
::: zone-end
::: zone pivot="programming-language-python"
RAG with Azure AI Search
::: zone-end
::: zone pivot="programming-language-java,programming-language-typescript,programming-language-powershell"
RAG with Azure AI Search
::: zone-end
::: zone pivot="programming-language-javascript"
RAG with Azure AI Search
::: zone-end

For RAG, you can use SDKs, including the Azure OpenAI and Azure SDKs, to build your scenarios.

::: zone pivot="programming-language-csharp"
Custom chat bot
::: zone-end
::: zone pivot="programming-language-python"
Custom chat bot
::: zone-end
::: zone pivot="programming-language-java,programming-language-typescript,programming-language-powershell"
Custom chat bot
::: zone-end
::: zone pivot="programming-language-javascript"
Custom chat bot
::: zone-end
::: zone pivot="programming-language-csharp,programming-language-javascript,programming-language-python"
Shows you how to create a friendly chat bot that issues simple prompts, receives text completions, and sends messages, all in a stateful session by using the OpenAI binding extension.
::: zone-end

## AI tools and frameworks for Azure Functions

Functions lets you build apps in your preferred language and use your favorite libraries. Because of this flexibility, you can use a wide range of AI libraries and frameworks in your AI-enabled function apps.

Here are some key Microsoft AI frameworks you should be aware of:

| Framework/library | Description |
| --- | --- |
| Agent Framework | Easily build AI agents and agentic workflows. |
| Agent Service | A fully managed service for building, deploying, and scaling AI agents with enterprise-grade security, built-in tools, and seamless integration with Azure Functions. |
| Foundry Tools SDKs | By working directly with client SDKs, you can use the full breadth of Foundry Tools functionality directly in your function code. |

Functions also lets your apps reference third-party libraries and frameworks, so you can use all of your favorite AI tools and libraries in your AI-enabled functions.

## Related articles