This article focuses on fundamental concepts and examples that help you create parameterized data pipelines in Azure Data Factory. Parameterization and dynamic expressions can save significant time and allow for more flexible Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) solutions. They reduce solution maintenance costs and speed up the implementation of new features in existing pipelines, because parameterization minimizes hard coding and increases the number of reusable objects and processes in a solution.
## Azure Data Factory UI and parameters
If you're new to Azure Data Factory parameter usage in the ADF user interface, review [Data Factory UI for linked services with parameters](./parameterize-linked-services.md#ui-experience) and [Data Factory UI for metadata driven pipeline with parameters](./how-to-use-trigger-parameterization.md#data-factory-ui) for a visual explanation.
## Parameter and expression concepts
You can use parameters to pass external values into pipelines, datasets, linked services, and data flows. After you pass a parameter into a resource, you can't change it. When you parameterize resources, you can reuse them with different values each time. You can use parameters individually or as part of expressions. JSON values in the definition can be literal values or expressions that are evaluated at runtime.
For example:
```json
"name": "@pipeline().parameters.password"
```
Expressions can appear anywhere in a JSON string value and always result in another JSON value. Here, *password* is a pipeline parameter in the expression. If a JSON value is an expression, the body of the expression is extracted by removing the at-sign (\@). If you need a literal string that starts with \@, escape it by using \@\@. The following examples show how expressions are evaluated.
|JSON value|Result|
|----------------|------------|
|"parameters"|The characters 'parameters' are returned.|
|"parameters[1]"|The characters 'parameters[1]' are returned.|
|"\@\@"|A 1-character string that contains '\@' is returned.|
|" \@"|A 2-character string that contains ' \@' is returned.|
46
47
47
-
Expressions can also appear inside strings, using a feature called *string interpolation* where expressions are wrapped in `@{ ... }`. For example: `"name" : "First Name: @{pipeline().parameters.firstName} Last Name: @{pipeline().parameters.lastName}"`
48
+
Expressions can also appear inside strings through a feature called *string interpolation*, where expressions are wrapped in `@{ ... }`. For example: `"name" : "First Name: @{pipeline().parameters.firstName} Last Name: @{pipeline().parameters.lastName}"`
When you use string interpolation, the result is always a string. Say you define `myNumber` as `42` and `myString` as `foo`:
|JSON value|Result|
|----------------|------------|
|"\@pipeline().parameters.myString"| Returns `foo` as a string.|
|"\@{pipeline().parameters.myString}"| Returns `foo` as a string.|
|"\@pipeline().parameters.myNumber"| Returns `42` as a *number*.|
|"\@{pipeline().parameters.myNumber}"| Returns `42` as a *string*.|
|"Answer is: \@{pipeline().parameters.myNumber}"| Returns the string `Answer is: 42`.|
|"\@concat('Answer is: ', string(pipeline().parameters.myNumber))"| Returns the string `Answer is: 42`|
|"Answer is: \@\@{pipeline().parameters.myNumber}"| Returns the string `Answer is: @{pipeline().parameters.myNumber}`.|
## Examples of parameters in expressions
### Complex expression example
The following example references a deep sub-field of activity output. To reference a pipeline parameter that evaluates to a sub-field, use `[]` syntax instead of the dot (`.`) operator, as in the case of subfield1 and subfield2:

`@activity('activityName').output.subfield1.subfield2[pipeline().parameters.subfield3].subfield4`
### Dynamic content editor
The dynamic content editor automatically escapes characters in your content when you finish editing. For example, the following content in the content editor is a string interpolation with two expression functions.
```json
{
    "type": "@{if(equals(1, 2), 'Blob', 'Table' )}",
    "name": "@{toUpper('myData')}"
}
```
The dynamic content editor converts the content above to the expression `"{ \n \"type\": \"@{if(equals(1, 2), 'Blob', 'Table' )}\",\n \"name\": \"@{toUpper('myData')}\"\n}"`. The result of this expression is a JSON-format string:
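Because `equals(1, 2)` evaluates to false, the `if` function returns `'Table'`, and `toUpper('myData')` returns `'MYDATA'`, so the string evaluates to:

```json
{
    "type": "Table",
    "name": "MYDATA"
}
```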
### A dataset with parameters
In the following example, the BlobDataset takes a parameter named **path**. Its value sets the **folderPath** property by using the expression `dataset().path`.
```json
{
    "name": "BlobDataset",
    "properties": {
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": "@dataset().path"
        },
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "path": {
                "type": "String"
            }
        }
    }
}
```
### A pipeline with parameters
In the following example, the pipeline takes **inputPath** and **outputPath** parameters. The **path** for the parameterized blob dataset is set by using the values of these parameters. The syntax used here is: `pipeline().parameters.parametername`.
```json
{
    "name": "Adfv2QuickStartPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToBlob",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.inputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "BlobDataset",
                        "parameters": {
                            "path": "@pipeline().parameters.outputPath"
                        },
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "BlobSource"
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                }
            }
        ],
        "parameters": {
            "inputPath": {
                "type": "String"
            },
            "outputPath": {
                "type": "String"
            }
        }
    }
}
```
## Call functions within expressions
You can call functions within expressions. The following sections provide information about the functions that you can use in an expression.
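For example, functions can nest inside one another in a single expression. A minimal sketch, assuming a hypothetical `region` pipeline parameter set to `west`:

```json
"name": "@toUpper(concat('sales_', pipeline().parameters.region))"
```

This returns the string `SALES_WEST`.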
### String functions
To work with strings, you can use these string functions and also some [collection functions](#collection-functions). String functions work only on strings.
| String function | Task |
| --------------- | ---- |
|[concat](control-flow-expression-language-functions.md#concat)| Combine two or more strings, and return the combined string. |
|[endsWith](control-flow-expression-language-functions.md#endswith)| Check whether a string ends with the specified substring. |
|[guid](control-flow-expression-language-functions.md#guid)| Generate a globally unique identifier (GUID) as a string. |
|[indexOf](control-flow-expression-language-functions.md#indexof)| Return the starting position for a substring. |
|[lastIndexOf](control-flow-expression-language-functions.md#lastindexof)| Return the starting position for the last occurrence of a substring. |
|[replace](control-flow-expression-language-functions.md#replace)| Replace a substring with the specified string, and return the updated string. |
|[split](control-flow-expression-language-functions.md#split)| Return an array that contains substrings from a larger string, based on a specified delimiter character. |
|[startsWith](control-flow-expression-language-functions.md#startswith)| Check whether a string starts with a specific substring. |
|[substring](control-flow-expression-language-functions.md#substring)| Return characters from a string, starting from the specified position. |
|[toLower](control-flow-expression-language-functions.md#tolower)| Return a string in lowercase format. |
|[toUpper](control-flow-expression-language-functions.md#toupper)| Return a string in uppercase format. |
|[trim](control-flow-expression-language-functions.md#trim)| Remove leading and trailing whitespace from a string, and return the updated string. |
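As a hedged illustration (the `fileName` parameter is hypothetical), string functions compose inside a single expression:

```json
"tableName": "@replace(toLower(pipeline().parameters.fileName), '.csv', '')"
```

With `fileName` set to `Sales.csv`, this returns `sales`.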
### Collection functions
To work with collections, generally arrays, strings, and sometimes dictionaries, you can use these collection functions.
| Collection function | Task |
| ------------------- | ---- |
|[contains](control-flow-expression-language-functions.md#contains)| Check whether a collection has a specific item. |
|[empty](control-flow-expression-language-functions.md#empty)| Check whether a collection is empty. |
|[first](control-flow-expression-language-functions.md#first)| Return the first item from a collection. |
|[intersection](control-flow-expression-language-functions.md#intersection)| Return a collection that has only the common items across the specified collections. |
|[join](control-flow-expression-language-functions.md#join)| Return a string that has all the items from an array, separated by the specified character. |
|[last](control-flow-expression-language-functions.md#last)| Return the last item from a collection. |
|[length](control-flow-expression-language-functions.md#length)| Return the number of items in a string or array. |
|[skip](control-flow-expression-language-functions.md#skip)| Remove items from the front of a collection, and return all the other items. |
|[take](control-flow-expression-language-functions.md#take)| Return items from the front of a collection. |
|[union](control-flow-expression-language-functions.md#union)| Return a collection that has all the items from the specified collections. |
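A hedged sketch, assuming a hypothetical `fileList` parameter that holds a comma-separated string such as `a.csv,b.csv`:

```json
"firstFile": "@first(split(pipeline().parameters.fileList, ','))"
```

`split` produces an array, and `first` returns its first item, `a.csv`.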
### Logical functions
These functions are useful inside conditions. You can use them to evaluate any type of logic.
| Logical comparison function | Task |
| --------------------------- | ---- |
|[and](control-flow-expression-language-functions.md#and)| Check whether all expressions are true. |
|[equals](control-flow-expression-language-functions.md#equals)| Check whether both values are equivalent. |
|[greater](control-flow-expression-language-functions.md#greater)| Check whether the first value is greater than the second value. |
|[greaterOrEquals](control-flow-expression-language-functions.md#greaterorequals)| Check whether the first value is greater than or equal to the second value. |
|[if](control-flow-expression-language-functions.md#if)| Check whether an expression is true or false. Based on the result, return a specified value. |
|[less](control-flow-expression-language-functions.md#less)| Check whether the first value is less than the second value. |
|[lessOrEquals](control-flow-expression-language-functions.md#lessorequals)| Check whether the first value is less than or equal to the second value. |
|[not](control-flow-expression-language-functions.md#not)| Check whether an expression is false. |
|[or](control-flow-expression-language-functions.md#or)| Check whether at least one expression is true. |
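A hedged sketch, assuming a hypothetical `env` pipeline parameter:

```json
"tableName": "@if(equals(pipeline().parameters.env, 'prod'), 'Sales', 'Sales_Test')"
```

When `env` equals `prod`, this returns `Sales`; otherwise it returns `Sales_Test`.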
### Conversion functions
You can use these functions to convert between each of the native types in the language:
- string
- integer
- float
- boolean
- arrays
- dictionaries
| Conversion function | Task |
| ------------------- | ---- |
|[array](control-flow-expression-language-functions.md#array)| Return an array from a single specified input. For multiple inputs, see [createArray](control-flow-expression-language-functions.md#createarray). |
|[base64](control-flow-expression-language-functions.md#base64)| Return the base64-encoded version for a string. |
|[base64ToBinary](control-flow-expression-language-functions.md#base64tobinary)| Return the binary version for a base64-encoded string. |
|[base64ToString](control-flow-expression-language-functions.md#base64tostring)| Return the string version for a base64-encoded string. |
|[binary](control-flow-expression-language-functions.md#binary)| Return the binary version for an input value. |
|[bool](control-flow-expression-language-functions.md#bool)| Return the Boolean version for an input value. |
|[coalesce](control-flow-expression-language-functions.md#coalesce)| Return the first non-null value from one or more parameters. |
|[createArray](control-flow-expression-language-functions.md#createarray)| Return an array from multiple inputs. |
|[dataUri](control-flow-expression-language-functions.md#datauri)| Return the data URI for an input value. |
|[dataUriToBinary](control-flow-expression-language-functions.md#datauritobinary)| Return the binary version for a data URI. |
|[dataUriToString](control-flow-expression-language-functions.md#datauritostring)| Return the string version for a data URI. |
|[decodeBase64](control-flow-expression-language-functions.md#decodebase64)| Return the string version for a base64-encoded string. |
|[decodeDataUri](control-flow-expression-language-functions.md#decodedatauri)| Return the binary version for a data URI. |
|[decodeUriComponent](control-flow-expression-language-functions.md#decodeuricomponent)| Return a string that replaces escape characters with decoded versions. |
|[encodeUriComponent](control-flow-expression-language-functions.md#encodeuricomponent)| Return a string that replaces URL-unsafe characters with escape characters. |
|[float](control-flow-expression-language-functions.md#float)| Return a floating point number for an input value. |
|[int](control-flow-expression-language-functions.md#int)| Return the integer version for a string. |
|[json](control-flow-expression-language-functions.md#json)| Return the JavaScript Object Notation (JSON) type value or object for a string or XML. |
|[string](control-flow-expression-language-functions.md#string)| Return the string version for an input value. |
|[uriComponent](control-flow-expression-language-functions.md#uricomponent)| Return the URI-encoded version for an input value by replacing URL-unsafe characters with escape characters. |
|[uriComponentToBinary](control-flow-expression-language-functions.md#uricomponenttobinary)| Return the binary version for a URI-encoded string. |
|[uriComponentToString](control-flow-expression-language-functions.md#uricomponenttostring)| Return the string version for a URI-encoded string. |
|[xml](control-flow-expression-language-functions.md#xml)| Return the XML version for a string. |
|[xpath](control-flow-expression-language-functions.md#xpath)| Check XML for nodes or values that match an XPath (XML Path Language) expression, and return the matching nodes or values. |
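A hedged sketch, assuming a hypothetical `retryCount` parameter that arrives as a string:

```json
"attempts": "@int(pipeline().parameters.retryCount)"
```

With `retryCount` set to `"3"`, this returns the integer `3`.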
### Math functions
You can use these functions for either type of number: **integers** and **floats**.
| Math function | Task |
| ------------- | ---- |
|[add](control-flow-expression-language-functions.md#add)| Return the result from adding two numbers. |
|[div](control-flow-expression-language-functions.md#div)| Return the result from dividing two numbers. |
|[max](control-flow-expression-language-functions.md#max)| Return the highest value from a set of numbers or an array. |
|[min](control-flow-expression-language-functions.md#min)| Return the lowest value from a set of numbers or an array. |
|[mod](control-flow-expression-language-functions.md#mod)| Return the remainder from dividing two numbers. |
|[mul](control-flow-expression-language-functions.md#mul)| Return the product from multiplying two numbers. |
|[rand](control-flow-expression-language-functions.md#rand)| Return a random integer from a specified range. |
|[range](control-flow-expression-language-functions.md#range)| Return an integer array that starts from a specified integer. |
|[sub](control-flow-expression-language-functions.md#sub)| Return the result from subtracting the second number from the first number. |
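A hedged sketch that reuses the `myNumber` parameter defined earlier (the `paddedCount` property is illustrative):

```json
"paddedCount": "@add(pipeline().parameters.myNumber, 8)"
```

With `myNumber` set to `42`, this returns `50`.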
## Detailed examples for practice
### Azure Data Factory copy pipeline with parameters
This [Azure Data Factory copy pipeline parameter passing tutorial](https://azure.microsoft.com/mediahandler/files/resourcefiles/azure-data-factory-passing-parameters/Azure%20data%20Factory-Whitepaper-PassingParameters.pdf) walks you through how to pass parameters between a pipeline and an activity, and between activities.
### Mapping data flow pipeline with parameters
Follow the [Mapping data flow with parameters](./parameters-data-flow.md) guide for a comprehensive example of how to use parameters in data flow.
### Metadata driven pipeline with parameters
Follow the [Metadata driven pipeline with parameters](./how-to-use-trigger-parameterization.md) guide to learn more about how to use parameters to design metadata driven pipelines. This is a common use case for parameters.
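As a hedged sketch of the metadata-driven pattern (the `tableList` parameter and `ForEachTable` activity are illustrative, not taken from the guide), a ForEach activity can iterate over a parameterized list, and expressions such as `@item().name` reference the current item inside the loop:

```json
{
    "name": "ForEachTable",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.tableList",
            "type": "Expression"
        }
    }
}
```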
## Related content
For a list of system variables you can use in expressions, see [System variables](control-flow-system-variables.md).