Large payload support lets your app pass orchestration inputs and activity outputs that exceed the [Durable Task Scheduler](durable-task-scheduler.md) message size limit. When a payload goes over the configured threshold, the framework stores the serialized payload in Azure Blob Storage and sends a small reference through Durable Task Scheduler.
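The externalize-or-inline decision described above can be pictured with a minimal sketch. All names here are hypothetical, and an in-memory dict stands in for the Blob Storage container; the real framework does this transparently during serialization:

```python
import json
import uuid

# Hypothetical stand-in for the payload blob container.
blob_container: dict[str, bytes] = {}

THRESHOLD_BYTES = 1_024  # illustrative; the real limit is around 1 MB

def externalize_if_large(payload: object) -> dict:
    """Return the payload inline, or a small blob reference if it exceeds the threshold."""
    data = json.dumps(payload).encode("utf-8")
    if len(data) <= THRESHOLD_BYTES:
        return {"inline": payload}
    blob_name = f"payload-{uuid.uuid4()}"
    blob_container[blob_name] = data   # upload step
    return {"blobRef": blob_name}      # small message sent through the scheduler

def resolve(message: dict) -> object:
    """Reverse step: fetch the real payload when a reference arrives."""
    if "inline" in message:
        return message["inline"]
    return json.loads(blob_container[message["blobRef"]])

small = externalize_if_large("hi")
large = externalize_if_large("x" * 10_000)
```

Only the small reference crosses the Durable Task Scheduler message path; the downstream worker resolves it back to the full payload.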
-This feature is available only for C# apps:
+This feature is available for:
-[Durable Functions](../../azure-functions/durable-functions/durable-functions-overview.md) with the .NET isolated worker
If your workflow stores data in Blob Storage and passes only a URI or identifier, keep using that pattern. Use large payload support when your orchestration logic must pass the payload between durable operations. For general guidance, see [Data persistence and serialization in Durable Functions](../../azure-functions/durable-functions/durable-functions-serialization-and-persistence.md#keep-inputs-and-outputs-small).
@@ -29,8 +30,8 @@ This table shows large payload support by framework.
| Framework | Support status | What you need |
| --- | --- | --- |
| Durable Functions | Supported in .NET isolated C# | Use Durable Task Scheduler as the storage provider and use `AzureWebJobsStorage` for payload blobs |
-| Durable Task SDKs | Supported in .NET | Use `Microsoft.DurableTask.Extensions.AzureBlobPayloads` with Azure Blob Storage |
-| JavaScript, Python, PowerShell, and Java | Not available | Use external storage and pass references between durable operations |
+| Durable Task SDKs | Supported in .NET and Python | Use the language-specific Azure Blob payload extension with Azure Blob Storage |
+| JavaScript, PowerShell, and Java | Not available | Use external storage and pass references between durable operations |
## How it works
@@ -122,7 +123,7 @@ Large payload support with Durable Task Scheduler is available only for .NET iso
::: zone pivot="durable-task-sdks"
-Large payload support in the Durable Task SDKs is available only for .NET apps.
+Large payload support in the Durable Task SDKs is available for .NET and Python apps.
# [C#](#tab/csharp)
@@ -171,19 +172,60 @@ For an end-to-end .NET example, see the [Durable Task SDK large payload sample](
# [JavaScript](#tab/javascript)
-Large payload support with Durable Task Scheduler is available only for the .NET Durable Task SDK.
+Large payload support with Durable Task Scheduler is available for the .NET and Python Durable Task SDKs.
# [PowerShell](#tab/powershell)
-Large payload support with Durable Task Scheduler is available only for the .NET Durable Task SDK.
+Large payload support with Durable Task Scheduler is available for the .NET and Python Durable Task SDKs.
# [Python](#tab/python)
-Large payload support with Durable Task Scheduler is available only for the .NET Durable Task SDK.
+Install the Python SDK with the Azure Blob payload extension and the Azure Managed transport package.
+
+Create a `BlobPayloadStore`, choose a threshold, and pass the same store to both the worker and the client:
+
+```python
+from durabletask.azuremanaged.client import DurableTaskSchedulerClient
+from durabletask.azuremanaged.worker import DurableTaskSchedulerWorker
+from durabletask.extensions.azure_blob_payloads import BlobPayloadStore, BlobPayloadStoreOptions
+
+store = BlobPayloadStore(BlobPayloadStoreOptions(
+    connection_string=storage_connection_string,
+    container_name="durabletask-payloads",
+    threshold_bytes=900_000,
+))
+
+with DurableTaskSchedulerWorker(
+    host_address=endpoint,
+    secure_channel=secure_channel,
+    taskhub=taskhub_name,
+    token_credential=credential,
+    payload_store=store,
+) as worker:
+    worker.start()
+
+client = DurableTaskSchedulerClient(
+    host_address=endpoint,
+    secure_channel=secure_channel,
+    taskhub=taskhub_name,
+    token_credential=credential,
+    payload_store=store,
+)
+```
+
+If you use Microsoft Entra ID instead of a storage connection string, set `account_url` and `credential` in `BlobPayloadStoreOptions`. The sample uses `DefaultAzureCredential`.
+
+Keep `threshold_bytes` at or below `1,048,576` bytes. The current sample uses `1,024` bytes so you can see payload externalization happen during a local run.
+
+For an end-to-end Python example, see the [Durable Task SDK large payload Python sample](https://github.com/microsoft/durabletask-python/tree/main/examples/large_payload).
# [Java](#tab/java)
-Large payload support with Durable Task Scheduler is available only for the .NET Durable Task SDK.
+Large payload support with Durable Task Scheduler is available for the .NET and Python Durable Task SDKs.
---
@@ -230,7 +272,7 @@ Large payload support with Durable Task Scheduler is available only for .NET iso
::: zone pivot="durable-task-sdks"
-Use these environment variables with the current .NET Durable Task SDK sample.
+Use these environment variables with the current Durable Task SDK samples.
# [C#](#tab/csharp)
@@ -250,19 +292,25 @@ If `PAYLOAD_STORAGE_CONNECTION_STRING` isn't set and `PAYLOAD_STORAGE_ACCOUNT_UR
|`STORAGE_CONNECTION_STRING`| Blob storage connection string for externalized payloads |`UseDevelopmentStorage=true`|
+
+The Python sample sets `threshold_bytes` in code with `BlobPayloadStoreOptions`.
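Wiring the environment variables into the store options might look like the following sketch. `STORAGE_CONNECTION_STRING` and the `UseDevelopmentStorage=true` fallback come from the sample settings table; the helper name, the `PAYLOAD_THRESHOLD_BYTES` variable, and the threshold default are illustrative assumptions, not part of the SDK:

```python
import os

def payload_store_settings() -> dict:
    """Collect payload-store settings from env vars with local-dev fallbacks.

    Hypothetical helper: STORAGE_CONNECTION_STRING matches the sample;
    PAYLOAD_THRESHOLD_BYTES and the 900 KB default are made up for illustration.
    """
    return {
        "connection_string": os.environ.get(
            "STORAGE_CONNECTION_STRING", "UseDevelopmentStorage=true"
        ),
        "container_name": "durabletask-payloads",
        "threshold_bytes": int(os.environ.get("PAYLOAD_THRESHOLD_BYTES", "900000")),
    }

settings = payload_store_settings()
```

The returned dict can then be splatted into `BlobPayloadStoreOptions(**settings)` if you adopt this layout.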
# [Java](#tab/java)
-This sample is shown for .NET, Java, and Python.
+Sample settings are shown for .NET and Python.
---
@@ -297,6 +345,7 @@ The sample apps also validate the round trip:
- The Durable Functions samples return a small summary object that confirms the input and output sizes.
- The .NET Durable Task SDK sample prints whether the run creates new payload blobs.
+- The Python Durable Task SDK sample runs both inline and externalized payload flows and prints the orchestration result for each run.
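The kind of round-trip check the samples perform can be sketched in plain Python. The activity stub and summary shape here are illustrative, not the samples' actual code:

```python
def echo_activity(payload: str) -> str:
    # Stand-in for an activity whose output is as large as its input.
    return payload.upper()

def validate_round_trip(size_bytes: int) -> dict:
    """Run a payload of the requested size through the stub and summarize sizes."""
    payload = "a" * size_bytes
    result = echo_activity(payload)
    return {
        "input_bytes": len(payload.encode("utf-8")),
        "output_bytes": len(result.encode("utf-8")),
        "round_trip_ok": result.lower() == payload,
    }

summary = validate_round_trip(2_000_000)  # well above the ~1 MB message limit
```

A summary object like this stays small, so it can be returned through the scheduler even when the payloads it describes could not.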
Because the runtime stores externalized payloads with gzip content encoding, Azure reports the compressed on-disk blob size. With the current low-compressibility sample payloads, those blob sizes should stay reasonably close to the logical payload size.
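The compressibility point can be seen directly with the standard library: gzip shrinks repetitive data dramatically, while near-random data (like the sample payloads) stays close to its logical size:

```python
import gzip
import os

repetitive = b"A" * 100_000          # highly compressible
random_ish = os.urandom(100_000)     # low compressibility, like the sample payloads

compressed_rep = gzip.compress(repetitive)
compressed_rand = gzip.compress(random_ish)

# Repetitive data collapses to a tiny fraction of its size; random data stays
# near 100 KB (gzip framing can even add a little overhead).
```

This is why the reported on-disk blob sizes track the logical payload sizes for the current samples.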
articles/durable-task/scheduler/durable-task-scheduler.md (1 addition, 1 deletion)
@@ -180,7 +180,7 @@ Stale orchestration data should be purged periodically to ensure efficient stora
| Orchestration custom status | 1 MB |
| Entity state | 1 MB |
-If you need to pass larger payloads, use [large payload support with Durable Task Scheduler](./durable-task-scheduler-large-payloads.md). That option is currently available only for C# apps in Durable Functions .NET isolated and the .NET Durable Task SDK.
+If you need to pass larger payloads, use [large payload support with Durable Task Scheduler](./durable-task-scheduler-large-payloads.md), if it's available for your language and SDK.