| title | Respond to blob storage events using Azure Functions |
|---|---|
| description | Learn how to use the Azure Developer CLI (azd) to create resources and deploy a local project to a Flex Consumption plan on Azure Functions. The project features a Blob Storage trigger that runs in response to blob storage events. |
| ms.date | 12/02/2025 |
| ms.topic | quickstart |
| zone_pivot_groups | programming-languages-set-functions |
In this quickstart, you use Visual Studio Code to build an app that responds to events in a Blob Storage container. After testing the code locally by using an emulator, you deploy it to a new serverless function app running in a Flex Consumption plan in Azure Functions.
The project uses the Azure Developer CLI (azd) extension with Visual Studio Code to simplify initializing and verifying your project code locally, as well as deploying your code to Azure. This deployment follows current best practices for secure and scalable Azure Functions deployments.
::: zone pivot="programming-language-javascript,programming-language-typescript"
This article supports version 4 of the Node.js programming model for Azure Functions.
::: zone-end
::: zone pivot="programming-language-python"
This article supports version 2 of the Python programming model for Azure Functions.
::: zone-end
[!INCLUDE functions-scenario-quickstarts-prerequisites-full]
- REST Client extension, or an equivalent REST tool that you can use to securely execute HTTP requests.
Use the azd init command from the command palette to create a local Azure Functions code project from a template.
1. In Visual Studio Code, open a folder or workspace where you want to create your project.
2. Press F1 to open the command palette, then search for and run the command **Azure Developer CLI (azd): Initialize App (init)**. There might be a slight delay while `azd` initializes the current folder or workspace.
::: zone pivot="programming-language-csharp"
3. When prompted, choose **Select a template**, then search for and select **Azure Functions C# Event Grid Blob Trigger using Azure Developer CLI**.
4. When prompted in the terminal, enter a unique environment name, such as `blobevents-dotnet`. This command pulls the project files from the template repository and initializes the project in the current folder or workspace.
::: zone-end
::: zone pivot="programming-language-python"
3. When prompted, choose **Select a template**, then search for and select **Azure Functions Python Event Grid Blob Trigger using Azure Developer CLI**.
4. When prompted in the terminal, enter a unique environment name, such as `blobevents-python`. This command pulls the project files from the template repository and initializes the project in the current folder or workspace.
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript"
3. When prompted, choose **Select a template**, then search for and select **Azure Functions TypeScript Event Grid Blob Trigger using Azure Developer CLI**.
4. When prompted in the terminal, enter a unique environment name, such as `blobevents-typescript`. This command pulls the project files from the template repository and initializes the project in the current folder or workspace.
::: zone-end
::: zone pivot="programming-language-java"
3. When prompted, choose **Select a template**, then search for and select **Azure Functions Java Event Grid Blob Trigger using Azure Developer CLI**.
4. When prompted in the terminal, enter a unique environment name, such as `blobevents-java`. This command pulls the project files from the template repository and initializes the project in the current folder or workspace.
::: zone-end
::: zone pivot="programming-language-powershell"
3. When prompted, choose **Select a template**, then search for and select **Azure Functions PowerShell Event Grid Blob Trigger using Azure Developer CLI**.
4. When prompted in the terminal, enter a unique environment name, such as `blobevents-powershell`. This command pulls the project files from the template repository and initializes the project in the current folder or workspace.
::: zone-end
In `azd`, an environment maintains a unique deployment context for your app, and you can define more than one. The environment name is also part of the name of the resource group that you create in Azure.
Functions needs the local.settings.json file to configure the host when running locally.
1. Run this command to go to the `src` app folder:

```console
cd src
```
::: zone pivot="programming-language-csharp"
2. Create a file named local.settings.json in the src folder that contains this JSON data:
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
"PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
}
}
```
::: zone-end
::: zone pivot="programming-language-java"
2. Create a file named local.settings.json in the src folder that contains this JSON data:
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "java",
"PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
}
}
```
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript"
2. Create a file named local.settings.json in the src folder that contains this JSON data:
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "node",
"PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
}
}
```
::: zone-end
::: zone pivot="programming-language-powershell"
2. Create a file named local.settings.json in the src folder that contains this JSON data:
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "powershell",
"FUNCTIONS_WORKER_RUNTIME_VERSION": "7.2",
"PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
}
}
```
::: zone-end
::: zone pivot="programming-language-python"
2. Create a file named local.settings.json in the src folder that contains this JSON data:
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "python",
"PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
}
}
```
3. In the `src` folder, run these commands to create and activate a virtual environment named `.venv`.

On Linux or macOS:

```console
python3 -m venv .venv
source .venv/bin/activate
```

If Python doesn't install the venv package on your Linux distribution, run the following command:

```console
sudo apt-get install python3-venv
```

On Windows, using bash:

```console
py -m venv .venv
source .venv/scripts/activate
```

On Windows, using Cmd or PowerShell:

```console
py -m venv .venv
.venv\scripts\activate
```
::: zone-end
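A typo in one of these setting names surfaces only as a runtime error when the host starts. If you want to sanity-check the file first, a small Python helper like the following can confirm that the settings this project relies on are present. This helper is illustrative only; it isn't part of the template:

```python
import json

# Keys this quickstart's local.settings.json is expected to define.
# (PowerShell projects add FUNCTIONS_WORKER_RUNTIME_VERSION on top of these.)
REQUIRED_KEYS = {"AzureWebJobsStorage", "FUNCTIONS_WORKER_RUNTIME", "PDFProcessorSTORAGE"}

def missing_settings(settings_json: str) -> list[str]:
    """Return the required keys absent from a local.settings.json document."""
    values = json.loads(settings_json).get("Values", {})
    return sorted(REQUIRED_KEYS - values.keys())

# Validate a settings document equivalent to the one shown above.
sample = """{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "PDFProcessorSTORAGE": "UseDevelopmentStorage=true"
  }
}"""

print(missing_settings(sample))  # an empty list means nothing is missing
```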
Use the Azurite emulator to run your code project locally before creating and using Azure resources.
1. If you haven't already, install Azurite.
2. Press F1. In the command palette, search for and run the command **Azurite: Start** to start the local storage emulator.
3. In the Azure area, expand **Workspace** > **Attached Storage Accounts** > **Local Emulator**, right-click (Ctrl-click on Mac) **Blob Containers**, select **Create Blob Container...**, and create these two blob storage containers in the local emulator:
   - `unprocessed-pdf`: the container that the trigger monitors for storage events.
   - `processed-pdf`: the container where the function sends processed blobs as output.
4. Expand **Blob Containers**, right-click (Ctrl-click on Mac) **unprocessed-pdf**, select **Upload Files...**, press Enter to accept the root directory, and upload the PDF files from the `data` project folder.
When running locally, you can use REST to trigger the function by simulating the function receiving a message from an event subscription.
Visual Studio Code integrates with Azure Functions Core Tools to let you run this project on your local development computer by using the Azurite emulator. The `PDFProcessorSTORAGE` environment variable defines the storage account connection; when running locally, it's set to `UseDevelopmentStorage=true` in the local.settings.json file.
1. Run this command from the `src` project folder in a terminal or command prompt:

::: zone pivot="programming-language-csharp,programming-language-powershell,programming-language-python"
```console
func start
```
::: zone-end
::: zone pivot="programming-language-java"
```console
mvn clean package
mvn azure-functions:run
```
::: zone-end
::: zone pivot="programming-language-javascript"
```console
npm install
func start
```
::: zone-end
::: zone pivot="programming-language-typescript"
```console
npm install
npm start
```
::: zone-end
When the Functions host starts, it writes the name of the trigger and the trigger type to the terminal output. In Functions, the project root folder contains the host.json file.
2. With Core Tools still running in the terminal, open the `test.http` file in your project and select **Send Request** to trigger the `ProcessBlobUpload` function by sending a test blob event to the blob event webhook. This step simulates receiving an event from an event subscription when running locally, and you should see the request and processed file information written in the logs. If you aren't using REST Client, you must use another secure REST tool to call the endpoint with the payload in `test.http`.
3. In the Workspace area for the blob container, expand **processed-pdf** and verify that the function processed the PDF file and copied it with a `processed-` prefix.
4. When you're done, press Ctrl+C in the terminal window to stop the `func.exe` host process.
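The request in `test.http` mimics the event that Event Grid would deliver when a blob lands in the monitored container. As a rough illustration of what that simulation involves, the following Python sketch builds a minimal `Microsoft.Storage.BlobCreated` event and targets the Event Grid webhook that Core Tools exposes locally. The port, function name, and field values here are assumptions based on this project's defaults; the payload in `test.http` remains the authoritative one:

```python
import json
from urllib import request

# A minimal Event Grid BlobCreated event for a PDF landing in the
# unprocessed-pdf container. Field values are illustrative.
event = {
    "id": "0000-demo-event",
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/unprocessed-pdf/blobs/sample.pdf",
    "eventTime": "2025-01-01T00:00:00Z",
    "data": {
        "api": "PutBlob",
        "contentType": "application/pdf",
        "url": "http://127.0.0.1:10000/devstoreaccount1/unprocessed-pdf/sample.pdf",
    },
    "dataVersion": "",
}
body = json.dumps(event).encode("utf-8")

# 7071 is the default Core Tools port; ProcessBlobUpload is this project's function.
url = "http://localhost:7071/runtime/webhooks/EventGrid?functionName=ProcessBlobUpload"
req = request.Request(
    url,
    data=body,
    headers={"Content-Type": "application/json", "aeg-event-type": "Notification"},
)

# Uncomment to send the event while the Functions host is running locally:
# with request.urlopen(req) as resp:
#     print(resp.status)
print(json.dumps(event, indent=2))
```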
::: zone pivot="programming-language-csharp"
You can review the code that defines the Event Grid blob trigger in the ProcessBlobUpload.cs project file. The function demonstrates how to:

- Use `BlobTrigger` with `Source = BlobTriggerSource.EventGrid` for near real-time processing
- Bind to `BlobClient` for the source blob and `BlobContainerClient` for the destination
- Process blob content and copy it to another container by using streams
::: zone-end
::: zone pivot="programming-language-python"
You can review the code that defines the Event Grid blob trigger in the function_app.py project file. The function demonstrates how to:

- Use `@app.blob_trigger` with `source="EventGrid"` for near real-time processing
- Access blob content by using the `InputStream` parameter
- Copy processed files to the destination container by using the Azure Storage SDK
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript"
You can review the code that defines the Event Grid blob trigger in the processBlobUpload.ts project file. The function demonstrates how to:

- Use `app.storageBlob()` with `source: 'EventGrid'` for near real-time processing
- Access blob content by using the Node.js Azure Storage SDK
- Process and copy files to the destination container asynchronously
::: zone-end
::: zone pivot="programming-language-java"
You can review the code that defines the Event Grid blob trigger in the ProcessBlobUpload.java project file. The function demonstrates how to:

- Use `@BlobTrigger` with `source = "EventGrid"` for near real-time processing
- Access blob content by using the `BlobInputStream` parameter
- Copy processed files to the destination container by using the Azure Storage SDK for Java
::: zone-end
::: zone pivot="programming-language-powershell"
You can review the code that defines the Event Grid blob trigger in the ProcessBlobUpload/run.ps1 project file and the corresponding function.json file. The function demonstrates how to:

- Configure the blob trigger with `"source": "EventGrid"` in function.json for near real-time processing
- Access blob content by using PowerShell Azure Storage cmdlets
- Process and copy files to the destination container by using Azure PowerShell modules
::: zone-end
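Whichever language pivot you chose, the function body boils down to the same pattern: read the triggering blob as a stream, then write its content back under a new name in the destination container. The following plain-Python sketch shows that core step with illustrative names; the actual templates use the Azure Storage bindings described above rather than these helpers:

```python
import io

def processed_name(blob_name: str) -> str:
    """Destination name for a processed blob: add the 'processed-' prefix."""
    return f"processed-{blob_name}"

def process_blob(source_name: str, source_stream: io.BufferedIOBase) -> tuple[str, bytes]:
    """Read the triggering blob and return the (name, content) pair to write
    to the processed-pdf container. The real templates would do their PDF
    handling between the read and the write."""
    content = source_stream.read()
    return processed_name(source_name), content

# Simulate the trigger firing for a small fake PDF payload.
name, data = process_blob("sample.pdf", io.BytesIO(b"%PDF-1.7 demo"))
print(name)  # processed-sample.pdf
```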
After you review and verify your function code locally, it's time to publish the project to Azure.
Use the azd up command to create the function app in a Flex Consumption plan along with other required Azure resources, including the event subscription. After the infrastructure is ready, azd also deploys your project code to the new function app in Azure.
1. In Visual Studio Code, press F1 to open the command palette. Search for and run the command **Azure Developer CLI (azd): Sign In with Azure Developer CLI**, then sign in by using your Azure account.
2. In the project root, press F1 to open the command palette. Search for and run the command **Azure Developer CLI (azd): Provision and Deploy (up)** to create the required Azure resources and deploy your code.
3. When prompted in the Terminal window, provide these required deployment parameters:

| Prompt | Description |
| --- | --- |
| Select an Azure Subscription to use | Choose the subscription in which you want to create your resources. |
| Environment name | An environment that's used to maintain a unique deployment context for your app. |
| Azure location | Azure region in which to create the resource group that contains the new Azure resources. Only regions that currently support the Flex Consumption plan are shown. |

The `azd up` command uses your responses to these prompts with the Bicep configuration files to create and configure these required Azure resources, following the latest best practices:

- Flex Consumption plan and function app
- Azure Storage account with blob containers
- Application Insights (recommended)
- Access policies and roles for your account
- Event Grid subscription for blob events
- Service-to-service connections by using managed identities (instead of stored connection strings)

After the command completes successfully, your app runs in Azure with an event subscription configured to trigger your function when blobs are added to the `unprocessed-pdf` container.
4. Make a note of the `AZURE_STORAGE_ACCOUNT_NAME` and `AZURE_FUNCTION_APP_NAME` values in the output. These names are unique for your storage account and function app in Azure, respectively.
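If you need these values again later, the `azd env get-values` command prints every environment output as one `KEY="value"` pair per line. The following Python helper, which is an illustration rather than part of the template, parses that output format into a dictionary (the sample output values are made up):

```python
def parse_azd_env(output: str) -> dict[str, str]:
    """Parse the KEY="value" lines that `azd env get-values` writes to stdout."""
    values = {}
    for line in output.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blank or malformed lines
        key, _, raw = line.partition("=")
        values[key] = raw.strip().strip('"')
    return values

# Example output captured from: azd env get-values
sample_output = '''AZURE_FUNCTION_APP_NAME="func-blobevents-123"
AZURE_STORAGE_ACCOUNT_NAME="stblobevents123"'''

env = parse_azd_env(sample_output)
print(env["AZURE_STORAGE_ACCOUNT_NAME"])  # stblobevents123
```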
1. In Visual Studio Code, press F1. In the command palette, search for and run the command **Azure Storage: Upload Files...**. Accept the root directory, and as before, upload one or more PDF files from the `data` project folder.
2. When prompted, select the name of your new storage account (from `AZURE_STORAGE_ACCOUNT_NAME`), then select **Blob Containers** > **unprocessed-pdf**.
3. Press F1. In the command palette, search for and run the command **Azure Storage: Open in Explorer**. Select the same storage account > **Blob Containers** > **processed-pdf**, then **Open in new window**.
4. In the Explorer, verify that the PDF files you uploaded were processed by your function. The output is written to the `processed-pdf` container with a `processed-` prefix.
The Event Grid blob trigger processes files within seconds of upload. This speed demonstrates the near real-time capabilities of this approach compared to traditional polling-based blob triggers.
[!INCLUDE functions-scenario-redeploy-cleanup]