articles/azure-functions/durable/durable-functions-azure-storage-provider.md (1 addition, 1 deletion)
@@ -249,7 +249,7 @@ As an example, if `durableTask/extendedSessionIdleTimeoutInSeconds` is set to 30
The specific effects of extended sessions on orchestrator and entity functions are described in the next sections.

> [!NOTE]
- > In the .NET isolated model, the extended sessions feature is currently only supported in orchestrations (not entities). Additionally, this feature is available only for .NET languages such as C# and F#. Setting `extendedSessionsEnabled` to `true` for other platforms can lead to runtime issues, such as silently failing to execute activity and orchestration-triggered functions.
+ > This feature is available only for .NET languages such as C# (isolated and in-process models) and F#. Setting `extendedSessionsEnabled` to `true` for other platforms can lead to runtime issues, such as silently failing to execute activity and orchestration-triggered functions.
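For context, the settings this hunk discusses live under `extensions/durableTask` in `host.json`. A minimal sketch (the values are illustrative, matching the 30-second example in the hunk header, not a recommendation):

```json
{
  "extensions": {
    "durableTask": {
      "extendedSessionsEnabled": true,
      "extendedSessionIdleTimeoutInSeconds": 30
    }
  }
}
```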
articles/service-connector/tutorial-csharp-webapp-storage-cli.md (11 additions, 11 deletions)
@@ -1,18 +1,18 @@
---
- title: Deploy a webapp connected to Azure Blob Storage
+ title: Deploy a Webapp Connected to Azure Blob Storage
description: This tutorial guides you through creating and deploying a web application that connects to Azure Blob Storage using Service Connector.
author: maud-lv
ms.author: malev
ms.service: service-connector
ms.topic: tutorial
- ms.date: 12/18/2024
+ ms.date: 12/02/2025
ms.devlang: azurecli
ms.custom: devx-track-azurecli
---

# Tutorial: Deploy a web application connected to Azure Blob Storage with Service Connector

- In this tutorial, you learn how to access Azure Blob Storage for a web app (not a signed-in user) running on Azure App Service by using managed identities. In this tutorial, you'll use the Azure CLI to complete the following tasks:
+ In this tutorial, you learn how to access Azure Blob Storage for a web app (not a signed-in user) running on Azure App Service by using managed identities. In this tutorial, you use the Azure CLI to complete the following tasks:

> [!div class="checklist"]
>
@@ -22,7 +22,7 @@ In this tutorial, you learn how to access Azure Blob Storage for a web app (not
## Prerequisites

- * An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/pricing/purchase-options/azure-account?cid=msft_learn).
+ - An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/pricing/purchase-options/azure-account?cid=msft_learn).
@@ -64,12 +64,12 @@ In this tutorial, you learn how to access Azure Blob Storage for a web app (not
1. In the terminal, make sure you're in the *WebAppStorageMISample* repository folder that contains the app code.

- 1. Create an App Service app (the host process) with the [`az webapp up`](/cli/azure/webapp#az-webapp-up) command below and replace the placeholders with your own data:
+ 1. Create an App Service app (the host process) with the [`az webapp up`](/cli/azure/webapp#az-webapp-up) command and replace the following placeholders with your own data:

-    * For the `--location` argument, use a [region supported by Service Connector](concept-region-support.md).
-    * Replace `<app-name>` with a unique name across Azure. The server endpoint is `https://<app-name>.azurewebsites.net`. Allowed characters for `<app-name>` are `A`-`Z`, `0`-`9`, and `-`. A good pattern is to use a combination of your company name and an app identifier.
+    - For the `--location` argument, use a [region supported by Service Connector](concept-region-support.md).
+    - Replace `<app-name>` with a unique name across Azure. The server endpoint is `https://<app-name>.azurewebsites.net`. Allowed characters for `<app-name>` are `A`-`Z`, `0`-`9`, and `-`. A good pattern is to use a combination of your company name and an app identifier.

   ```azurecli
   az webapp up --name <app-name> --sku B1 --location eastus --resource-group ServiceConnector-tutorial-rg
   ```

Replace the following placeholders with your own data:
- * Replace `<app-name>` with the web app name you used in step 3.
- * Replace `<storage-name>` with the storage app name you used in step 4.
+ - Replace `<app-name>` with the web app name you used in step 3.
+ - Replace `<storage-name>` with the storage app name you used in step 4.

> [!NOTE]
> If you see the error message "The subscription is not registered to use Microsoft.ServiceLinker", run `az provider register -n Microsoft.ServiceLinker` to register the Service Connector resource provider and run the connection command again.
@@ -111,7 +111,7 @@ The sample code is a web application. Each time you refresh the index page, the
## Next step

- To learn more about Service Connector, read the guide below.
+ To learn more about Service Connector, read the following guide.
articles/service-connector/tutorial-python-functions-storage-queue-as-trigger.md (14 additions, 14 deletions)
@@ -1,12 +1,12 @@
---
- title: 'Tutorial: Python function with Azure Queue Storage as trigger'
- description: Learn how you can connect a Python function to a storage queue as trigger using Service Connector
+ title: 'Tutorial: Python Function with Azure Queue Storage as Trigger'
+ description: Learn how you can connect a Python function to a storage queue as trigger using Service Connector.
author: houk-ms
ms.author: honc
ms.service: service-connector
ms.custom: devx-track-python
ms.topic: tutorial
- ms.date: 10/22/2024
+ ms.date: 12/04/2025
---

# Tutorial: Python function with Azure Queue Storage as trigger
@@ -29,7 +29,7 @@ An overview of the function project components in this tutorial:
| Cloud Function Auth Type | Connection String |

> [!WARNING]
- > Microsoft recommends that you use the most secure authentication flow available. The authentication flow described in this procedure requires a very high degree of trust in the application, and carries risks that are not present in other flows. You should only use this flow when other more secure flows, such as managed identities, aren't viable.
+ > Microsoft recommends that you use the most secure authentication flow available. The authentication flow described in this procedure requires a high degree of trust in the application, and carries risks that aren't present in other flows. You should only use this flow when other more secure flows, such as managed identities, aren't viable.

## Prerequisites
@@ -40,7 +40,7 @@ An overview of the function project components in this tutorial:
## Create a Python function project

- Follow the [tutorial to create a local Azure Functions project](../azure-functions/how-to-create-function-vs-code.md?pivot=programming-language-python?pivots=python-mode-configuration#create-an-azure-functions-project), and provide the following information at the prompts:
+ Follow the [tutorial to create a local Azure Functions project](../azure-functions/how-to-create-function-vs-code.md?pivot=programming-language-python?pivots=python-mode-configuration#create-an-azure-functions-project). Provide the following information at the prompts:
@@ -50,29 +50,29 @@ Follow the [tutorial to create a local Azure Functions project](../azure-functio
|**Provide a function name**| Enter `QueueStorageTriggerFunc`. |
|**Select setting from "local.settings.json"**| Choose `Create new local app settings`, which lets you select your Storage Account and provide your queue name that works as the trigger. |

- You created a Python function project with Azure Storage Queue as trigger. The local project connects to Azure Storage using the connection string saved into the `local.settings.json` file. Finally, the `main` function in `__init__.py` file of the function can consume the connection string with the help of the Function Binding defined in the `function.json` file.
+ With this information, Visual Studio Code generates a Python function project with Azure Storage Queue as the trigger. The local project connects to Azure Storage using the connection string saved into the `local.settings.json` file. Finally, the `main` function in the `__init__.py` file of the function can consume the connection string with the help of the Function Binding defined in the `function.json` file.
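As a sketch of the wiring described above, the Functions host reads local settings such as the storage connection string from `local.settings.json` and exposes them to bindings by name. The payload below is illustrative (the key names follow the standard Functions layout; the values are placeholders, not values from this tutorial):

```python
import json

# A typical local.settings.json payload; values are placeholders.
sample = """
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python"
  }
}
"""

settings = json.loads(sample)
# The Functions host surfaces every entry under "Values" as an app setting,
# which the queue trigger binding then looks up by name.
storage_connection = settings["Values"]["AzureWebJobsStorage"]
print(storage_connection)  # UseDevelopmentStorage=true
```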
## Run the function locally

Follow the [tutorial](../azure-functions/how-to-create-function-vs-code.md?pivot=programming-language-python?pivots=python-mode-configuration#run-the-function-locally) to run the function locally and verify the trigger.

- 1. Select the storage account as you chose when creating the Azure Function resource if you're prompted to connect to storage. This value is used for Azure Function's runtime, and it isn't necessarily the same as the storage account you use for the trigger.
+ 1. If you're prompted to connect to storage, select the storage account you chose when creating the Azure Function resource. This value is used for Azure Function's runtime, and it isn't necessarily the same as the storage account you use for the trigger.
1. To start the function locally, press <kbd>F5</kbd> or select the **Run and Debug** icon in the left-hand side Activity bar.
- 1. To verify the trigger works properly, keep the function running locally and open the Storage Queue pane in Azure portal, select **Add message** and provide a test message. You should see the function is triggered and processed as a queue item in your Visual Studio Code terminal.
+ 1. To verify the trigger works properly, keep the function running locally and open the Storage Queue pane in Azure portal. Select **Add message**, and provide a test message. You should see the function is triggered and processed as a queue item in your Visual Studio Code terminal.
## Create a connection using Service Connector

- In last step, you verified the function project locally. Now you'll learn how to configure the connection between the Azure Function and Azure Storage Queue in the cloud, so that your function can be triggered by the storage queue after being deployed to the cloud.
+ In the last step, you verified the function project locally. Next, you learn how to configure the connection between the Azure Function and Azure Storage Queue. Once your function is deployed to the cloud, this connection allows your function to receive triggers from the storage queue.

- 1. Open the `function.json` file in your local project, change the value of the `connection` property in `bindings` to be `AZURE_STORAGEQUEUE_CONNECTIONSTRING`.
+ 1. Open the `function.json` file in your local project, and change the value of the `connection` property in `bindings` to be `AZURE_STORAGEQUEUE_CONNECTIONSTRING`.
1. Run the following Azure CLI command to create a connection between your Azure Function and your Azure storage account.

   ```azurecli
   az functionapp connection create storage-queue --source-id "<your-function-resource-id>" --target-id "<your-storage-queue-resource-id>" --secret
   ```

This step creates a Service Connector resource that configures an `AZURE_STORAGEQUEUE_CONNECTIONSTRING` variable in the function's App Settings. The function binding runtime uses it to connect to the storage, so that the function can accept triggers from the storage queue. For more information, go to [how Service Connector helps Azure Functions connect to services](./how-to-use-service-connector-in-function.md).
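Tying the two steps together, the queue trigger binding in `function.json` references the Service Connector app setting by name through its `connection` property. A minimal sketch (the binding name `msg` and the queue name are illustrative placeholders, not values from this tutorial):

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "<your-queue-name>",
      "connection": "AZURE_STORAGEQUEUE_CONNECTIONSTRING"
    }
  ]
}
```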
@@ -81,7 +81,7 @@ This step creates a Service Connector resource that configures an `AZURE_STORAGE
Now you can deploy your function to Azure and verify the storage queue trigger works.

1. Follow this [Azure Functions tutorial](../azure-functions/how-to-create-function-vs-code.md?pivot=programming-language-python?pivots=python-mode-configuration#deploy-the-project-to-azure) to deploy your function to Azure.
- 1. Open the Storage Queue pane in the Azure portal, select **Add message** and provide a test message. You should see the function is triggered and processed as a queue item in your function logs.
+ 1. Open the Storage Queue pane in the Azure portal. Select **Add message**, and provide a test message. You should see the function is triggered and processed as a queue item in your function logs.
articles/storage/container-storage/use-container-storage-with-local-disk.md (1 addition, 1 deletion)
@@ -278,7 +278,7 @@ In this section, you learn how to check node ephemeral disk capacity, expand sto
An ephemeral volume is allocated on a single node. When you configure the size of your ephemeral volumes, the size should be less than the available capacity of the single node's ephemeral disk.

- Make sure a StorageClass for **localdisk.csi.acstor.io** exists. Run the following command to check the available capacity of ephemeral disk for each node.
+ Make sure a StorageClass for `localdisk.csi.acstor.io` exists. Run the following command to check the available capacity of ephemeral disk for each node.

```azurecli
kubectl get csistoragecapacities.storage.k8s.io -n kube-system -o custom-columns=NAME:.metadata.name,STORAGE_CLASS:.storageClassName,CAPACITY:.capacity,NODE:.nodeTopology.matchLabels."topology\.localdisk\.csi\.acstor\.io/node"
```
0 commit comments