---
title: Azure Blob storage input binding for Azure Functions
description: Learn how to read and work with blob data from Azure Blob storage containers in your function code using an input binding.
ms.topic: reference
ms.date: 05/12/2024
ms.devlang: csharp
zone_pivot_groups: programming-languages-set-functions
ms.custom:
---
The input binding allows you to read blob storage data as input to an Azure Function.
For information on setup and configuration details, see the overview.
::: zone pivot="programming-language-javascript,programming-language-typescript"
[!INCLUDE functions-nodejs-model-tabs-description]
::: zone-end
::: zone pivot="programming-language-python"
[!INCLUDE functions-python-model-tabs-description]
::: zone-end
::: zone pivot="programming-language-csharp"
[!INCLUDE functions-bindings-csharp-intro]
[!INCLUDE functions-in-process-model-retirement-note]
The following example is a C# function that runs in an isolated worker process and uses a blob trigger with both blob input and blob output bindings. The function is triggered by the creation of a blob in the test-samples-trigger container. It reads a text file from the test-samples-input container and creates a new text file in an output container based on the name of the triggered file.
:::code language="csharp" source="~/azure-functions-dotnet-worker/samples/Extensions/Blob/BlobFunction.cs" range="9-26":::
The following example is a C# function that uses a queue trigger and an input blob binding. The queue message contains the name of the blob, and the function logs the size of the blob.
```csharp
[FunctionName("BlobInput")]
public static void Run(
    [QueueTrigger("myqueue-items")] string myQueueItem,
    [Blob("samples-workitems/{queueTrigger}", FileAccess.Read)] Stream myBlob,
    ILogger log)
{
    log.LogInformation($"BlobInput processed blob\n Name:{myQueueItem} \n Size: {myBlob.Length} bytes");
}
```

::: zone-end
::: zone pivot="programming-language-java"
This section contains the following examples:
- HTTP trigger: look up blob name from query string
- Queue trigger: receive blob name from queue message
The following example shows a Java function that uses the HttpTrigger annotation to receive a parameter containing the name of a file in a blob storage container. The BlobInput annotation then reads the file and passes its contents to the function as a byte[].
```java
@FunctionName("getBlobSizeHttp")
@StorageAccount("Storage_Account_Connection_String")
public HttpResponseMessage blobSize(
    @HttpTrigger(name = "req",
        methods = {HttpMethod.GET},
        authLevel = AuthorizationLevel.ANONYMOUS)
    HttpRequestMessage<Optional<String>> request,
    @BlobInput(
        name = "file",
        dataType = "binary",
        path = "samples-workitems/{Query.file}")
    byte[] content,
    final ExecutionContext context) {
    // build HTTP response with size of requested blob
    return request.createResponseBuilder(HttpStatus.OK)
        .body("The size of \"" + request.getQueryParameters().get("file") + "\" is: " + content.length + " bytes")
        .build();
}
```

The following example shows a Java function that uses the `QueueTrigger` annotation to receive a message containing the name of a file in a blob storage container. The `BlobInput` annotation then reads the file and passes its contents to the function as a `byte[]`.
```java
@FunctionName("getBlobSize")
@StorageAccount("Storage_Account_Connection_String")
public void blobSize(
    @QueueTrigger(
        name = "filename",
        queueName = "myqueue-items-sample")
    String filename,
    @BlobInput(
        name = "file",
        dataType = "binary",
        path = "samples-workitems/{queueTrigger}")
    byte[] content,
    final ExecutionContext context) {
    context.getLogger().info("The size of \"" + filename + "\" is: " + content.length + " bytes");
}
```

In the Java functions runtime library, use the `@BlobInput` annotation on parameters whose value comes from a blob. This annotation can be used with native Java types, POJOs, or nullable values using `Optional<T>`.
::: zone-end
::: zone pivot="programming-language-typescript"
[!INCLUDE functions-blob-storage-sdk-types-node]
The following example shows a queue triggered TypeScript function that makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
:::code language="typescript" source="~/azure-functions-nodejs-v4/ts/src/functions/storageBlobInputAndOutput1.ts" :::
TypeScript samples are not documented for model v3.
::: zone-end
::: zone pivot="programming-language-javascript"
The following example shows a queue triggered JavaScript function that makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
:::code language="javascript" source="~/azure-functions-nodejs-v4/js/src/functions/storageBlobInputAndOutput1.js" :::
The following example shows blob input and output bindings in a function.json file and JavaScript code that uses the bindings. The function makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path properties:
```json
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "myInputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false
}
```

The configuration section explains these properties.
Here's the JavaScript code:
```javascript
module.exports = async function(context) {
    context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
    context.bindings.myOutputBlob = context.bindings.myInputBlob;
};
```

::: zone-end
::: zone pivot="programming-language-powershell"
The following example shows a blob input binding, defined in the function.json file, which makes the incoming blob data available to the PowerShell function.
Here's the json configuration:
```json
{
  "bindings": [
    {
      "name": "InputBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "source/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

Here's the function code:

```powershell
# Input bindings are passed in via param block.
param([byte[]] $InputBlob, $TriggerMetadata)

Write-Host "PowerShell Blob trigger: Name: $($TriggerMetadata.Name) Size: $($InputBlob.Length) bytes"
```

::: zone-end
::: zone pivot="programming-language-python"
This example uses SDK types to directly access the underlying BlobClient object provided by the Blob storage input binding:
:::code language="python" source="~/functions-python-extensions/azurefunctions-extensions-bindings-blob/samples/blob_samples_blobclient/function_app.py" range="9-12,40-50":::
For examples of using other SDK types, see the ContainerClient and StorageStreamDownloader samples. For a step-by-step tutorial on how to include SDK-type bindings in your function app, follow the Python SDK Bindings for Blob Sample.
To learn more, including what other SDK type bindings are supported, see SDK type bindings.
The code creates a copy of a blob.
```python
import logging
import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="BlobOutput1")
@app.route(route="file")
@app.blob_input(arg_name="inputblob",
                path="PATH/TO/BLOB",
                connection="CONNECTION_SETTING")
@app.blob_output(arg_name="outputblob",
                 path="PATH/TO/NEW/BLOB",
                 connection="CONNECTION_SETTING")
def main(req: func.HttpRequest, inputblob: str, outputblob: func.Out[str]):
    logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    outputblob.set(inputblob)
    return "ok"
```

The function makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path properties:
```json
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "queuemsg",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "dataType": "binary",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "$return",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false,
  "scriptFile": "__init__.py"
}
```

The configuration section explains these properties.
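At run time, the `{queueTrigger}` token in each `path` property is replaced with the text of the queue message. As an illustration only (this is not the actual Functions runtime code, and `resolve_binding_path` is a hypothetical helper), the substitution works roughly like this:

```python
import re

def resolve_binding_path(template: str, trigger_metadata: dict) -> str:
    """Fill {token} placeholders in a binding path from trigger metadata.

    Illustrative sketch only; the real resolution happens inside the
    Functions host.
    """
    return re.sub(r"\{(\w+)\}", lambda m: trigger_metadata[m.group(1)], template)

# A queue message of "sample-file.txt" resolves both blob paths:
metadata = {"queueTrigger": "sample-file.txt"}
print(resolve_binding_path("samples-workitems/{queueTrigger}", metadata))
# samples-workitems/sample-file.txt
print(resolve_binding_path("samples-workitems/{queueTrigger}-Copy", metadata))
# samples-workitems/sample-file.txt-Copy
```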
The dataType property determines which binding is used. The following values are available to support different binding strategies:
| Binding value | Default | Description | Example |
|---|---|---|---|
| `string` | N | Uses generic binding and casts the input blob as a string | `def main(input: str)` |
| `binary` | N | Uses generic binding and casts the input blob as a `bytes` Python object | `def main(input: bytes)` |
If the dataType property is not defined in function.json, the default value is string.
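As a rough illustration of what each `dataType` value hands your function (the `payload` value here is a hypothetical blob payload; the real conversion happens inside the Functions host):

```python
# Hypothetical raw blob payload, as the Functions host would fetch it.
payload = b"hello blob"

# dataType "binary" passes the raw bytes through: def main(input: bytes)
as_binary: bytes = payload

# dataType "string" (the default) decodes the payload: def main(input: str)
as_string: str = payload.decode("utf-8")

print(type(as_binary).__name__, type(as_string).__name__)
# bytes str
```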
Here's the Python code:
```python
import logging
import azure.functions as func

# The input binding field inputblob can be either 'bytes' or 'str', depending
# on the dataType in function.json: 'binary' or 'string'.
def main(queuemsg: func.QueueMessage, inputblob: bytes) -> bytes:
    logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    return inputblob
```

::: zone-end
::: zone pivot="programming-language-csharp"
Both in-process and isolated worker process C# libraries use attributes to define the function. C# script instead uses a function.json configuration file as described in the C# scripting guide.
The isolated worker model defines an input binding by using a `BlobInputAttribute` attribute, which takes the following parameters:
| Parameter | Description |
|---|---|
| BlobPath | The path to the blob. |
| Connection | The name of an app setting or setting collection that specifies how to connect to Azure Blobs. See Connections. |
In C# class libraries, use the BlobAttribute, which takes the following parameters:
| Parameter | Description |
|---|---|
| BlobPath | The path to the blob. |
| Connection | The name of an app setting or setting collection that specifies how to connect to Azure Blobs. See Connections. |
| Access | Indicates whether the binding reads or writes the blob. |
The following example shows how the attribute's constructor takes the path to the blob and a FileAccess parameter indicating read for the input binding:
```csharp
[FunctionName("BlobInput")]
public static void Run(
    [QueueTrigger("myqueue-items")] string myQueueItem,
    [Blob("samples-workitems/{queueTrigger}", FileAccess.Read)] Stream myBlob,
    ILogger log)
{
    log.LogInformation($"BlobInput processed blob\n Name:{myQueueItem} \n Size: {myBlob.Length} bytes");
}
```

[!INCLUDE functions-bindings-storage-attribute]
[!INCLUDE app settings to local.settings.json]
::: zone-end
::: zone pivot="programming-language-python"
Applies only to the Python v2 programming model.
For Python v2 functions defined using decorators, the following properties on the `blob_input` and `blob_output` decorators define the blob input and output bindings:

| Property | Description |
|---|---|
| `arg_name` | The name of the variable that represents the blob in function code. |
| `path` | The path to the blob. For the `blob_input` decorator, it's the blob read. For the `blob_output` decorator, it's the output or copy of the input blob. |
| `connection` | The storage account connection string. |
| `data_type` | For dynamically typed languages, specifies the underlying data type. Possible values are `string`, `binary`, or `stream`. For more detail, refer to the triggers and bindings concepts. |
For Python functions defined by using function.json, see the Configuration section.
::: zone-end
::: zone pivot="programming-language-java"
The @BlobInput attribute gives you access to the blob that triggered the function. If you use a byte array with the attribute, set dataType to binary. Refer to the input example for details.
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript,programming-language-powershell,programming-language-python"
::: zone-end
::: zone pivot="programming-language-python"
Applies only to the Python v1 programming model.
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript"
The following table explains the properties that you can set on the options object passed to the input.storageBlob() method.
| Property | Description |
|---|---|
| path | The path to the blob. |
| connection | The name of an app setting or setting collection that specifies how to connect to Azure Blobs. See Connections. |
The following table explains the binding configuration properties that you set in the function.json file.
| Property | Description |
|---|---|
| type | Must be set to blob. |
| direction | Must be set to in. Exceptions are noted in the usage section. |
| name | The name of the variable that represents the blob in function code. |
| path | The path to the blob. |
| connection | The name of an app setting or setting collection that specifies how to connect to Azure Blobs. See Connections. |
| dataType | For dynamically typed languages, specifies the underlying data type. Possible values are string, binary, or stream. For more detail, refer to the triggers and bindings concepts. |
::: zone-end
::: zone pivot="programming-language-powershell,programming-language-python"
The following table explains the binding configuration properties that you set in the function.json file.
| function.json property | Description |
|---|---|
| type | Must be set to blob. |
| direction | Must be set to in. Exceptions are noted in the usage section. |
| name | The name of the variable that represents the blob in function code. |
| path | The path to the blob. |
| connection | The name of an app setting or setting collection that specifies how to connect to Azure Blobs. See Connections. |
| dataType | For dynamically typed languages, specifies the underlying data type. Possible values are string, binary, or stream. For more detail, refer to the triggers and bindings concepts. |
::: zone-end
See the Example section for complete examples.
::: zone pivot="programming-language-csharp"
The binding types supported by Blob input depend on the extension package version and the C# modality used in your function app.
[!INCLUDE functions-bindings-storage-blob-input-dotnet-isolated-types]
See Binding types for a list of supported types.
Binding to `string` or `byte[]` is only recommended when the blob size is small, since the entire blob contents are loaded into memory. For most blobs, use a `Stream` or `BlobClient` type. For more information, see Concurrency and memory usage.
If you get an error message when trying to bind to one of the Storage SDK types, make sure that you have a reference to the correct Storage SDK version.
[!INCLUDE functions-bindings-blob-storage-attribute]
::: zone-end
::: zone pivot="programming-language-java"
The @BlobInput attribute gives you access to the blob that triggered the function. If you use a byte array with the attribute, set dataType to binary. Refer to the input example for details.
::: zone-end
::: zone pivot="programming-language-javascript,programming-language-typescript"
With the Node.js v4 programming model, access the blob data by using `context.extraInputs.get()`. With the v3 model, access the blob data by using `context.bindings.<name>`, where `<name>` is the value specified in the `name` property of function.json.
::: zone-end
::: zone pivot="programming-language-powershell"
Access the blob data via a parameter whose name matches the binding's `name` property in the function.json file.
::: zone-end
::: zone pivot="programming-language-python"
Access blob data via the parameter typed as InputStream. Refer to the input example for details.
Functions also supports Python SDK type bindings for Azure Blob storage, which let you work with blob data using the underlying SDK types.

> [!NOTE]
> Only synchronous SDK types are supported.

> [!IMPORTANT]
> SDK types support for Python is generally available and is only supported for the Python v2 programming model. For more information, see SDK types in Python.

::: zone-end
[!INCLUDE functions-storage-blob-connections]