---
title: "Tutorial: Durable text analysis with a mounted Azure Files share in Azure Functions"
description: Learn how to deploy a Python Azure Functions app that uses Durable Functions to orchestrate parallel text file analysis by using a mounted Azure Files share on a Flex Consumption plan.
ms.service: azure-functions
ms.topic: tutorial
ms.date: 03/24/2026
ms.custom:
  - devx-track-azurecli
  - devx-track-azdevcli
  - devx-track-python
---

# Tutorial: Durable text analysis with a mounted Azure Files share

In this tutorial, you deploy a Python Azure Functions app that uses Durable Functions to orchestrate parallel text file analysis. Your function app mounts an Azure Files share, analyzes multiple text files in parallel (fan-out), aggregates the results (fan-in), and returns them to the caller. This approach demonstrates a key advantage of storage mounts: shared file access across multiple function instances without per-request network overhead.

In this tutorial, you:

> [!div class="checklist"]
>
> - Use Azure Developer CLI to deploy a Durable Functions app in a Flex Consumption plan with a mounted Azure Files share
> - Trigger an orchestration to process sample text files in parallel
> - Verify the aggregated analysis results

[!INCLUDE functions-azure-files-samples-note]

## Prerequisites

The CLI examples in this tutorial use Bash syntax and have been tested in Azure Cloud Shell (Bash) and Linux/macOS terminals.

## Initialize the sample project

You can find the sample code for this tutorial in the Azure Functions Flex Consumption with Azure Files OS Mount Samples GitHub repository. The `durable-text-analysis` folder contains the function app code, a Bicep template that provisions the required Azure resources, and a post-deployment script that uploads sample text files.

1. Open a terminal and go to the directory where you want to clone the repository.

2. Clone the repository:

   ```bash
   git clone https://github.com/Azure-Samples/Azure-Functions-Flex-Consumption-with-Azure-Files-OS-Mount-Samples.git
   ```

3. Go to the project folder:

   ```bash
   cd Azure-Functions-Flex-Consumption-with-Azure-Files-OS-Mount-Samples/durable-text-analysis
   ```

4. Initialize the azd environment. When prompted, enter an environment name such as `durable-text`:

   ```bash
   azd init
   ```

## Review the code

The three key pieces that make this sample work are the infrastructure that creates the mount, the script that uploads sample files, and the function code that orchestrates the analysis.

The `mounts.bicep` module configures an Azure Files SMB mount on the function app. The `mountPath` value determines the local path where files appear at runtime. You pass the storage account access key as a parameter, and the platform resolves it at runtime through a Key Vault reference:

:::code language="bicep" source="~/functions-flex-azure-files-samples/durable-text-analysis/infra/app/mounts.bicep" :::
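Because the share is mounted into the file system, your function code sees it as an ordinary directory. As a minimal sketch (the `/mounts/textfiles` path and `FILES_MOUNT_PATH` variable name are illustrative, not taken from the sample):

```python
import os

# Hypothetical mount path; the real value comes from mountPath in mounts.bicep.
MOUNT_PATH = os.environ.get("FILES_MOUNT_PATH", "/mounts/textfiles")

def list_text_files(mount_path: str) -> list[str]:
    """Return the sorted .txt file names visible on the mounted share."""
    return sorted(
        name for name in os.listdir(mount_path) if name.endswith(".txt")
    )
```

Standard directory listing is all that's needed; no storage SDK is involved.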

Because Azure Files SMB mounts don't yet support managed identity authentication, you need a storage account key. As a best practice, store this key in Azure Key Vault and use a Key Vault reference in an app setting. The mount configuration references that app setting by using `@AppSettingRef()`, so the key never appears in your Bicep templates. The `keyvault.bicep` module creates the vault, stores the key, and grants the required RBAC roles:

:::code language="bicep" source="~/functions-flex-azure-files-samples/durable-text-analysis/infra/app/keyvault.bicep" :::

The `main.bicep` file invokes the mount and Key Vault modules:

:::code language="bicep" source="~/functions-flex-azure-files-samples/durable-text-analysis/infra/main.bicep" range="173-206" :::

After `azd up` deploys the infrastructure and code, a post-deployment script creates sample text files, uploads them to the Azure Files share, and runs a health check:

:::code language="bash" source="~/functions-flex-azure-files-samples/durable-text-analysis/scripts/post-up.sh" range="33-88" :::

The HTTP starter in `function_app.py` starts a Durable Functions orchestration. The orchestrator in `orchestrator.py` lists all `.txt` files on the mount, fans out to analyze each file in parallel, and aggregates the results:

:::code language="python" source="~/functions-flex-azure-files-samples/durable-text-analysis/src/orchestrator.py" :::
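The fan-out/fan-in shape itself is independent of Durable Functions. As a plain-Python illustration of the pattern (not the sample's orchestrator, which uses the Durable Functions task API), the same structure with a thread pool looks like this:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze(text: str) -> dict:
    # Stand-in for the activity function: count words and characters.
    return {"word_count": len(text.split()), "char_count": len(text)}

def fan_out_fan_in(texts: list[str]) -> dict:
    # Fan out: analyze every text in parallel.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(analyze, texts))
    # Fan in: aggregate the per-file results into one summary.
    return {
        "results": results,
        "total_words": sum(r["word_count"] for r in results),
        "total_chars": sum(r["char_count"] for r in results),
    }
```

In the real sample, Durable Functions provides the same parallelism plus durable checkpointing, so a restart mid-orchestration doesn't lose completed work.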

Each activity function reads directly from the mounted share by using standard file I/O; no SDK or network calls are needed:

:::code language="python" source="~/functions-flex-azure-files-samples/durable-text-analysis/src/activities.py" range="30-51" :::
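A self-contained sketch of such an activity follows. The word lists and the sentiment heuristic are invented for illustration and aren't the sample's actual logic; the point is that reading the file is ordinary `open()`-based I/O:

```python
import os

# Illustrative word lists; the sample's sentiment logic may differ.
POSITIVE = {"great", "good", "excellent", "happy"}
NEGATIVE = {"bad", "poor", "terrible", "sad"}

def analyze_file(path: str) -> dict:
    # Plain file I/O against the mounted share: no storage SDK, no network call.
    with open(path, encoding="utf-8") as f:
        text = f.read()
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {
        "file": os.path.basename(path),
        "word_count": len(words),
        "char_count": len(text),
        "sentiment": sentiment,
    }
```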


## Deploy by using Azure Developer CLI

This sample is an Azure Developer CLI (azd) template. A single `azd up` command provisions infrastructure, deploys the function code, and uploads sample text files to the Azure Files share.

1. Sign in to Azure. The post-deployment script uses Azure CLI commands, so you need to authenticate by using both tools:

   ```bash
   azd auth login
   az login
   ```

2. Provision and deploy everything:

   ```bash
   azd up
   ```

   When prompted, select the Azure subscription and location to use. The command then:

   - Creates a resource group, storage account, Key Vault, Flex Consumption function app with a Durable Functions configuration, Application Insights instance, and managed identity
   - Deploys the Python function code
   - Uploads sample text files to the Azure Files share
   - Runs a health check

   > [!NOTE]
   > Because Azure Files SMB mounts don't yet support managed identity authentication, the deployment needs a storage account key. As a best practice, the deployment stores this key in Azure Key Vault and uses a Key Vault reference so the key value never appears in plain text in app settings. This approach provides centralized secret management, auditing, and support for key rotation.

   The deployment takes a few minutes. When it completes, you see a summary of the created resources.

3. Save the resource names as shell variables for the remaining steps:

   ```bash
   RESOURCE_GROUP=$(azd env get-value AZURE_RESOURCE_GROUP)
   FUNCTION_APP_NAME=$(azd env get-value AZURE_FUNCTION_APP_NAME)
   FUNCTION_APP_URL=$(azd env get-value AZURE_FUNCTION_APP_URL)
   ```

## Trigger the orchestration

1. Get the function host key:

   ```bash
   HOST_KEY=$(az functionapp keys list \
     --resource-group $RESOURCE_GROUP \
     --name $FUNCTION_APP_NAME \
     --query "functionKeys.default" \
     -o tsv)
   ```

2. Start the orchestration:

   ```bash
   curl -s -X POST "${FUNCTION_APP_URL}/api/start-analysis?code=${HOST_KEY}" | jq .
   ```

   The response includes an instance ID and status query URIs:

   ```json
   {
     "id": "abc123def456",
     "statusQueryGetUri": "https://...",
     "sendEventPostUri": "https://...",
     "terminatePostUri": "https://..."
   }
   ```

## Verify results

1. Check the orchestration status. Use the `statusQueryGetUri` value from the previous response, or construct the URL manually:

   ```bash
   INSTANCE_ID="<instance-id-from-trigger-response>"

   curl -s "${FUNCTION_APP_URL}/api/orchestrators/TextAnalysisOrchestrator/${INSTANCE_ID}?code=${HOST_KEY}" | jq .
   ```

   While the orchestration is running, `runtimeStatus` is `Running`. When it completes, the response looks like this example:

   ```json
   {
     "name": "TextAnalysisOrchestrator",
     "instanceId": "abc123def456",
     "runtimeStatus": "Completed",
     "output": {
       "results": [
         {
           "file": "sample1.txt",
           "word_count": 15,
           "char_count": 98,
           "sentiment": "positive"
         },
         {
           "file": "sample2.txt",
           "word_count": 18,
           "char_count": 120,
           "sentiment": "positive"
         },
         {
           "file": "sample3.txt",
           "word_count": 12,
           "char_count": 85,
           "sentiment": "neutral"
         }
       ],
       "total_words": 45,
       "total_chars": 303,
       "analysis_duration_seconds": 2.34
     }
   }
   ```
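If you script the verification, you can also assert that the fan-in totals match the per-file results. A sketch against the payload shape shown above (`totals_consistent` is a hypothetical helper, not part of the sample):

```python
import json

def totals_consistent(status_json: str) -> bool:
    """Check that aggregated totals equal the sum of per-file counts."""
    output = json.loads(status_json)["output"]
    return (
        output["total_words"] == sum(r["word_count"] for r in output["results"])
        and output["total_chars"] == sum(r["char_count"] for r in output["results"])
    )
```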

> [!TIP]
> Your function app reads all three files in parallel through the storage mount by using standard file I/O, with no per-request network calls. This approach demonstrates the power of combining storage mounts with Durable Functions.

## Clean up resources

To avoid ongoing charges, delete all the resources that this tutorial created:

```bash
azd down --purge
```

> [!WARNING]
> This command deletes the resource group and all resources in it, including the function app, storage account, and Application Insights instance.

## Related content