
Commit f4162e6

committed
docs: convert YAML code blocks to tables in expressions reference
1 parent 58cd0b7 commit f4162e6

1 file changed

Lines changed: 57 additions & 175 deletions

File tree

articles/iot-operations/connect-to-cloud/concept-dataflow-graphs-expressions.md

@@ -195,19 +195,15 @@ Expression: `$1 + "/" + $2`
 
 In the following example, the MQTT `topic` property is mapped to the `origin_topic` field in the output:
 
-```yaml
-inputs:
-  - $metadata.topic
-output: origin_topic
-```
+| Input | Output |
+|-------|--------|
+| `$metadata.topic` | `origin_topic` |
 
 If the user property `priority` is present in the MQTT message, the following example demonstrates how to map it to an output field:
 
-```yaml
-inputs:
-  - $metadata.user_property.priority
-output: priority
-```
+| Input | Output |
+|-------|--------|
+| `$metadata.user_property.priority` | `priority` |
 
 ### Write to metadata
 
@@ -217,27 +213,21 @@ Setting a metadata field to an empty value (`()`) removes it. For user propertie
 
 You can also map metadata properties to an output header or user property. In the following example, the MQTT `topic` is mapped to the `origin_topic` field in the output's user property:
 
-```yaml
-inputs:
-  - $metadata.topic
-output: $metadata.user_property.origin_topic
-```
+| Input | Output |
+|-------|--------|
+| `$metadata.topic` | `$metadata.user_property.origin_topic` |
 
 If the incoming payload contains a `priority` field, the following example demonstrates how to map it to an MQTT user property:
 
-```yaml
-inputs:
-  - priority
-output: $metadata.user_property.priority
-```
+| Input | Output |
+|-------|--------|
+| `priority` | `$metadata.user_property.priority` |
 
 The same example for Kafka:
 
-```yaml
-inputs:
-  - priority
-output: $metadata.header.priority
-```
+| Input | Output |
+|-------|--------|
+| `priority` | `$metadata.header.priority` |
 
 Metadata fields are supported in map, filter, and branch rules. They aren't available in window (accumulate) rules.
 
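The metadata mappings in this hunk all follow the same read-from-one-location, write-to-another pattern. As a rough sketch of that behavior (the message shape and sample values are invented for illustration; this isn't the product's internal representation):

```python
# Hypothetical message shape: metadata and payload held in separate sections.
message = {
    "metadata": {"topic": "factory/line1/temperature"},
    "payload": {"temperature": 21.5},
}

# Input `$metadata.topic` -> output field `origin_topic`:
output = dict(message["payload"])                      # payload fields pass through
output["origin_topic"] = message["metadata"]["topic"]  # metadata copied into the payload
```

The user-property and Kafka-header variants differ only in which metadata section is read from or written to.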
@@ -314,28 +304,11 @@ JSON objects and arrays are preserved as-is when fields are copied without an ex
 
 ## Dot notation and escaping
 
-Dot notation is widely used to reference nested fields. A standard dot-notation sample looks like this:
+Dot notation is widely used to reference nested fields. A standard dot-notation path looks like `Person.Address.Street.Number`.
 
-```yaml
-- inputs:
-  - Person.Address.Street.Number
-```
-
-In a data flow, a path described by dot notation might include strings and some special characters without needing escaping:
-
-```yaml
-- inputs:
-  - Person.Date of Birth
-```
+In a data flow, a path described by dot notation might include strings and some special characters without needing escaping, such as `Person.Date of Birth`.
 
-In other cases, escaping is necessary:
-
-```yaml
-- inputs:
-  - nsu=http://opcfoundation.org/UA/Plc/Applications;s=RandomSignedInt32
-```
-
-The previous example, among other special characters, contains dots within the field name. Without escaping, the field name would serve as a separator in the dot notation itself.
+In other cases, escaping is necessary, for example: `nsu=http://opcfoundation.org/UA/Plc/Applications;s=RandomSignedInt32`. This path, among other special characters, contains dots within the field name. Without escaping, the field name would serve as a separator in the dot notation itself.
 
 While a data flow parses a path, it treats only two characters as special:
 
@@ -344,56 +317,18 @@ While a data flow parses a path, it treats only two characters as special:
 
 Any other characters are treated as part of the field name. This flexibility is useful in formats like JSON, where field names can be arbitrary strings.
 
-The path definition must also adhere to the rules of YAML. When a character with special meaning is included in the path, proper quoting is required in the configuration. Consult the YAML documentation for precise rules. Here are some examples that demonstrate the need for careful formatting:
-
-```yaml
-- inputs:
-  - ':Person:.:name:' # ':' cannot be used as the first character without single quotation marks
-  - '100 celsius.hot' # numbers followed by text would not be interpreted as a string without single quotation marks
-```
+The path definition must also adhere to the rules of the configuration format. When a character with special meaning is included in the path, proper quoting is required. For example, field names that start with a colon (like `:Person:.:name:`) or that begin with a number followed by text (like `100 celsius.hot`) need quoting in the configuration to be interpreted correctly as strings.
 
 ### Escaping
 
-The primary function of escaping in a dot-notated path is to accommodate the use of dots that are part of field names rather than separators:
-
-```yaml
-- inputs:
-  - 'Payload."Tag.10".Value'
-```
-
-The outer single quotation marks (`'`) are necessary because of YAML syntax rules, which allow the inclusion of double quotation marks within the string.
-
-In this example, the path consists of three segments: `Payload`, `Tag.10`, and `Value`.
+The primary function of escaping in a dot-notated path is to accommodate the use of dots that are part of field names rather than separators. For example, the path `Payload."Tag.10".Value` consists of three segments: `Payload`, `Tag.10`, and `Value`. The double quotation marks around `Tag.10` prevent the dot from acting as a separator.
 
 ### Escaping rules in dot notation
 
-* **Escape each segment separately:** If multiple segments contain dots, those segments must be enclosed in double quotation marks. Other segments can also be quoted, but it doesn't affect the path interpretation:
-
-    ```yaml
-    - inputs:
-      - 'Payload."Tag.10".Measurements."Vibration.$12".Value'
-    ```
+* **Escape each segment separately:** If multiple segments contain dots, those segments must be enclosed in double quotation marks. Other segments can also be quoted, but it doesn't affect the path interpretation. For example: `Payload."Tag.10".Measurements."Vibration.$12".Value`
 
 
-* **Proper use of double quotation marks:** Double quotation marks must open and close an escaped segment. Any quotation marks in the middle of the segment are considered part of the field name:
-
-    ```yaml
-    - inputs:
-      - 'Payload.He said: "Hello", and waved'
-    ```
-
-    This example defines two fields: `Payload` and `He said: "Hello", and waved`. When a dot appears under these circumstances, it continues to serve as a separator:
-
-    ```yaml
-    - inputs:
-      - 'Payload.He said: "No. It is done"'
-    ```
-
-    In this case, the path is split into the segments `Payload`, `He said: "No`, and `It is done"` (starting with a space).
+* **Proper use of double quotation marks:** Double quotation marks must open and close an escaped segment. Any quotation marks in the middle of the segment are considered part of the field name. For example, the path `Payload.He said: "Hello", and waved` defines two fields: `Payload` and `He said: "Hello", and waved`. When a dot appears under these circumstances, it continues to serve as a separator. For example, the path `Payload.He said: "No. It is done"` is split into the segments `Payload`, `He said: "No`, and `It is done"` (starting with a space).
 
 ### Segmentation algorithm
 
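The escaping rules in this hunk can be captured in a short sketch: only `.` and `"` are special, and a double quote acts as an escape delimiter only at the start of a segment. This is an illustrative reimplementation of the described behavior, not the product's actual parser:

```python
def split_path(path: str) -> list[str]:
    """Split a dot-notation path into segments, honoring double-quote escaping."""
    segments, buf, i = [], "", 0
    while i < len(path):
        ch = path[i]
        if ch == '"' and buf == "":
            # A quote at the start of a segment escapes everything up to the
            # closing quote, so dots inside stay part of the field name.
            closing = path.index('"', i + 1)
            buf = path[i + 1:closing]
            i = closing + 1
        elif ch == ".":
            segments.append(buf)  # unescaped dot separates segments
            buf = ""
            i += 1
        else:
            buf += ch             # any other character, including mid-segment quotes
            i += 1
    segments.append(buf)
    return segments
```

With this sketch, `split_path('Payload."Tag.10".Value')` yields `['Payload', 'Tag.10', 'Value']`, while the mid-segment quotes in `Payload.He said: "No. It is done"` are literal, so the dot after `"No` still separates.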
@@ -406,11 +341,9 @@ In many scenarios, the output record closely resembles the input record, with on
 
 Let's consider a basic scenario to understand the use of asterisks in mappings:
 
-```yaml
-- inputs:
-  - '*'
-  output: '*'
-```
+| Input | Output |
+|-------|--------|
+| `*` | `*` |
 
 This configuration shows a basic mapping where every field in the input is directly mapped to the same field in the output without any changes. The asterisk (`*`) serves as a wildcard that matches any field in the input record.
 
@@ -444,15 +377,10 @@ Original JSON:
 
 Mapping configuration that uses wildcards:
 
-```yaml
-- inputs:
-  - 'ColorProperties.*'
-  output: '*'
-
-- inputs:
-  - 'TextureProperties.*'
-  output: '*'
-```
+| Input | Output |
+|-------|--------|
+| `ColorProperties.*` | `*` |
+| `TextureProperties.*` | `*` |
 
 Resulting JSON:
 
@@ -504,13 +432,9 @@ Original JSON:
 
 Mapping configuration that uses wildcards:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  output: 'ColorProperties.*'
-  expression: ($1 + $2) / 2
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2) | `ColorProperties.*` | `($1 + $2) / 2` |
 
 Resulting JSON:
 
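The multi-input wildcard rule in this hunk pairs `$1` and `$2` per matched property: the same key must satisfy every input pattern for the rule to fire. A rough sketch of that behavior, with sample data invented for illustration:

```python
# Invented input record: each top-level key has Max and Min sub-fields.
record = {
    "Hue": {"Max": 0.6, "Min": 0.2},
    "Saturation": {"Max": 0.8, "Min": 0.1},
}

# `*.Max` ($1) and `*.Min` ($2) -> `ColorProperties.*` via ($1 + $2) / 2.
color_properties = {}
for key, props in record.items():
    if "Max" in props and "Min" in props:  # both inputs must match the same key
        color_properties[key] = (props["Max"] + props["Min"]) / 2

output = {"ColorProperties": color_properties}
```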
@@ -561,15 +485,9 @@ Original JSON:
 
 Initial mapping configuration that uses wildcards:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  - '*.Avg' # - $3
-  - '*.Mean' # - $4
-  output: 'ColorProperties.*'
-  expression: ($1, $2, $3, $4)
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2), `*.Avg` ($3), `*.Mean` ($4) | `ColorProperties.*` | `($1, $2, $3, $4)` |
 
 This initial mapping tries to build an array (for example, for `Opacity`: `[0.88, 0.91, 0.89, 0.89]`). This configuration fails because:
 
@@ -584,35 +502,20 @@ Because `Avg` and `Mean` are nested within `Mid`, the asterisk in the initial ma
 
 Corrected mapping configuration:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  - '*.Mid.Avg' # - $3
-  - '*.Mid.Mean' # - $4
-  output: 'ColorProperties.*'
-  expression: ($1, $2, $3, $4)
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2), `*.Mid.Avg` ($3), `*.Mid.Mean` ($4) | `ColorProperties.*` | `($1, $2, $3, $4)` |
 
 This revised mapping accurately captures the necessary fields. It correctly specifies the paths to include the nested `Mid` object, which ensures that the asterisks work effectively across different levels of the JSON structure.
 
 ### Specialization and second rules
 
 When you use the previous example from multi-input wildcards, consider the following mappings that generate two derived values for each property:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  output: 'ColorProperties.*.Avg'
-  expression: ($1 + $2) / 2
-
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  output: 'ColorProperties.*.Diff'
-  expression: $1 - $2
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2) | `ColorProperties.*.Avg` | `($1 + $2) / 2` |
+| `*.Max` ($1), `*.Min` ($2) | `ColorProperties.*.Diff` | `$1 - $2` |
 
 This mapping is intended to create two separate calculations (`Avg` and `Diff`) for each property under `ColorProperties`. This example shows the result:
 
@@ -639,19 +542,10 @@ Here, the second mapping definition on the same inputs acts as a *second rule* f
 
 Now, consider a scenario where a specific field needs a different calculation:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  output: 'ColorProperties.*'
-  expression: ($1 + $2) / 2
-
-- inputs:
-  - Opacity.Max # - $1
-  - Opacity.Min # - $2
-  output: ColorProperties.OpacityAdjusted
-  expression: ($1 + $2 + 1.32) / 2
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2) | `ColorProperties.*` | `($1 + $2) / 2` |
+| `Opacity.Max` ($1), `Opacity.Min` ($2) | `ColorProperties.OpacityAdjusted` | `($1 + $2 + 1.32) / 2` |
 
 In this case, the `Opacity` field has a unique calculation. Two options to handle this overlapping scenario are:
 
@@ -660,20 +554,12 @@ In this case, the `Opacity` field has a unique calculation. Two options to handl
 
 Consider a special case for the same fields to help decide the right action:
 
-```yaml
-- inputs:
-  - '*.Max' # - $1
-  - '*.Min' # - $2
-  output: 'ColorProperties.*'
-  expression: ($1 + $2) / 2
-
-- inputs:
-  - Opacity.Max
-  - Opacity.Min
-  output:
-```
+| Input | Output | Expression |
+|-------|--------|------------|
+| `*.Max` ($1), `*.Min` ($2) | `ColorProperties.*` | `($1 + $2) / 2` |
+| `Opacity.Max`, `Opacity.Min` | *(empty)* | |
 
-An empty `output` field in the second definition implies not writing the fields in the output record (effectively removing `Opacity`). This setup is more of a `Specialization` than a `Second Rule`.
+An empty output field in the second definition implies not writing the fields in the output record (effectively removing `Opacity`). This setup is more of a `Specialization` than a `Second Rule`.
 
 Resolution of overlapping mappings by data flows:
 
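The specialization behavior in this hunk can be sketched as follows: the specific (non-wildcard) rule claims `Opacity`, so the wildcard rule skips it. This illustrates the described semantics with invented sample data; it's not product code:

```python
def apply_rules(record: dict) -> dict:
    color_properties = {}
    for key, props in record.items():
        if key == "Opacity":
            # Specialization: the specific rule overrides the wildcard rule
            # for this field and writes to a different output name.
            color_properties["OpacityAdjusted"] = (props["Max"] + props["Min"] + 1.32) / 2
        else:
            # Wildcard rule: ($1 + $2) / 2 for every other property.
            color_properties[key] = (props["Max"] + props["Min"]) / 2
    return {"ColorProperties": color_properties}

result = apply_rules({
    "Opacity": {"Max": 0.88, "Min": 0.82},
    "Hue": {"Max": 0.6, "Min": 0.2},
})
```

If the specialized rule instead had an empty output, the first branch would simply drop the field rather than write `OpacityAdjusted`.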
@@ -699,19 +585,15 @@ Contextualization datasets can be used with wildcards. Consider a dataset named
 
 In an earlier example, we used a specific field from this dataset:
 
-```yaml
-- inputs:
-  - $context(position).BaseSalary
-  output: Employment.BaseSalary
-```
+| Input | Output |
+|-------|--------|
+| `$context(position).BaseSalary` | `Employment.BaseSalary` |
 
 This mapping copies `BaseSalary` from the context dataset directly into the `Employment` section of the output record. If you want to automate the process and include all fields from the `position` dataset into the `Employment` section, you can use wildcards:
 
-```yaml
-- inputs:
-  - '$context(position).*'
-  output: 'Employment.*'
-```
+| Input | Output |
+|-------|--------|
+| `$context(position).*` | `Employment.*` |
 
 This configuration allows for a dynamic mapping where every field within the `position` dataset is copied into the `Employment` section of the output record:
 
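The wildcard context mapping amounts to a bulk copy of the dataset into a sub-object of the output record. A minimal sketch of that behavior (the `WorkingHours` field and the payload shape are invented for illustration; only `BaseSalary` appears in the text above):

```python
# Hypothetical contents of the `position` contextualization dataset.
position = {"BaseSalary": 2000, "WorkingHours": "Regular"}

output_record = {"Name": "Alice"}  # invented payload field
# `$context(position).*` -> `Employment.*`: copy every dataset field.
output_record["Employment"] = dict(position)
```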