```shell
docker tag mcr.microsoft.com/azureiotoperations/devx-runtime:0.1.8 devx
```
This command builds all the operators in the workspace and creates `.wasm` files.
### Run the graph application locally
To start the local execution environment, press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Start Development Environment**. Select **release** as the run mode.
When the local execution environment is running, press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Run Application Graph**. Select **release** as the run mode. This command runs the graph application locally by using the local execution environment with the `graph.dataflow.yaml` file in the workspace.
It also reads from `hostapp.env.list` to set the environment variable `TK_CONFIGURATION_PARAMETERS` for data flow operator configuration parameters.
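For example, a `hostapp.env.list` entry might look like the following. The parameter value shown here is hypothetical; the expected format depends on your operators' configuration parameters.

```
TK_CONFIGURATION_PARAMETERS={"window_size":"10"}
```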
Copy the `data` folder that contains the sample data from the cloned samples repository `explore-iot-operations\samples\wasm\data` to the current workspace. The `data` folder contains three JSON files with sample input temperature data.
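Each JSON file holds messages shaped roughly like the following. The field names and values here are illustrative only; check the sample files for the actual schema.

```json
{
  "timestamp": "2025-01-01T00:00:00Z",
  "temperature": 25.0
}
```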
If you previously stopped the local execution environment, press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Start Development Environment**. Select **release** as the run mode.
Press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Run Application Graph**:
1. Select the `graph.dataflow.yaml` graph file.
You can modify the test data in the `data/` folder to experiment with different scenarios.
### Build and run the state store scenario
If you previously stopped the local execution environment, press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Start Development Environment**. Select **release** as the run mode.
1. Press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Build All Data Flow Operators**. Select **release** as the build mode. Wait for the build to complete.
1. Press `Ctrl+Shift+P` again and search for **Azure IoT Operations: Run Application Graph**. Select **release** as the run mode.
1. Select the `data` folder in your VS Code workspace for your input data. The DevX container launches to run the graph with the sample input.
The `data/` folder contains three test files.
### Build and run the schema registry scenario
If you previously stopped the local execution environment, press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Start Development Environment**. Select **release** as the run mode.
1. Press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Build All Data Flow Operators**.
1. Select **release** as the build mode. Wait for the build to complete.
This example shows how the schema registry validates incoming messages and filters out messages that don't conform to the defined schema.
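The validate-and-filter behavior can be sketched in plain Python. This is an illustrative sketch only: the field names and types below are hypothetical, not the sample's actual schema, and the real validation is performed by the schema registry, not by hand-written code.

```python
# Hypothetical sketch of schema-based filtering: messages that conform to
# an expected shape pass through; non-conforming messages are dropped.
# Field names ("temperature", "timestamp") are illustrative only.

def conforms(msg: dict) -> bool:
    """Return True when the required fields exist with the expected types."""
    return (
        isinstance(msg.get("temperature"), (int, float))
        and isinstance(msg.get("timestamp"), str)
    )

messages = [
    {"temperature": 21.5, "timestamp": "2026-02-19T10:00:00Z"},   # valid
    {"temperature": "hot", "timestamp": "2026-02-19T10:01:00Z"},  # wrong type
    {"timestamp": "2026-02-19T10:02:00Z"},                        # missing field
]

# Only the conforming message survives filtering.
valid = [m for m in messages if conforms(m)]
```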
Stop the local execution environment when you're done testing in release mode by pressing `Ctrl+Shift+P` and searching for **Azure IoT Operations: Stop Development Environment**.
## Debug WASM modules using the VS Code extension
This example shows you how to debug WASM modules locally using breakpoints and the integrated debugger in VS Code.
### Prerequisites
Complete the [Schema registry support for WASM modules](#schema-registry-support-for-wasm-modules) example to set up the sample workspace.
### Set up debugging
### Run with debugging enabled
Press `Ctrl+Shift+P` to open the command palette and search for **Azure IoT Operations: Start Development Environment**. Select **debug** as the run mode.
1. Press `Ctrl+Shift+P` and search for **Azure IoT Operations: Run Application Graph**.
1. Select the `lldb-debug.graph.dataflow.yaml` graph file.
1. After the DevX container launches, you see the host-app container start with an `lldb-server` for debugging.
### Debug the WASM module
1. The execution automatically stops at the breakpoint you set in the `filter` function.
This debugging capability enables you to troubleshoot issues and understand data flow.
- **Host app stability**: The local execution environment might occasionally stop working and require restarting to recover.
- **Remote debugging limitations**: Currently, you can't remotely debug WASM modules running in Azure Linux 3.0 due to incompatible LLDB versions.
File: `articles/stream-analytics/includes/event-generator-app.md`
---
title: Event generator application
description: This include file has steps for creating an event hub, event generator application, and Stream Analytics job to analyze the data generated by the event generator application.
author: spelluru
ms.service: azure-stream-analytics
ms.topic: include
ms.date: 02/19/2026
ms.author: spelluru
---
## Sign in to Azure
Use the following steps to create an event hub and send call data to that event hub.
:::image type="content" source="media/event-generator-app/create-event-hub-namespace.png" alt-text="Screenshot showing the Create Namespace page.":::
1. On the **Review + create** page of the namespace creation wizard, select **Create** at the bottom of the page after reviewing all settings.
5. After the namespace is deployed successfully, select **Go to resource** to navigate to the **Event Hubs Namespace** page.
6. On the **Event Hubs namespace** page, select **+Event Hub** on the command bar.
:::image type="content" source="media/event-generator-app/add-event-hub-button.png" alt-text="Screenshot showing the Add event hub button on the Event Hubs Namespace page.":::
1. On the **Create Event Hub** page, enter a **Name** for the event hub. Set the **Partition Count** to 2. Use the default options in the remaining settings and select **Review + create**.
:::image type="content" source="media/event-generator-app/create-event-hub-portal.png" alt-text="Screenshot showing the Create event hub page.":::
1. On the **Review + create** page, select **Create** at the bottom of the page. Then wait for the deployment to succeed.
Before an application can send data to Azure Event Hubs, the event hub must have a policy that allows access. The access policy produces a connection string that includes authorization information.
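A namespace connection string has the following general shape (placeholder values shown):

```
Endpoint=sb://<namespace-name>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>
```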
1. On the **Event Hubs namespace** page, select **Shared access policies** on the left menu.
1. Select **RootManageSharedAccessKey** from the list of policies.
:::image type="content" source="media/event-generator-app/select-key.png" alt-text="Screenshot that shows the Shared access policies page.":::
Before you start the TelcoGenerator app, you should configure it to send data to the event hub you created.
2. Open the `TelcoGenerator\TelcoGenerator\telcodatagen.exe.config` file in a text editor of your choice. There's more than one `.config` file, so be sure that you open the correct one.
3. Update the `<appSettings>` element in the config file with the following details:
* Set the value of the **EventHubName** key to the name of the event hub you created in the previous section.
* Set the value of the **Microsoft.ServiceBus.ConnectionString** key to the connection string for the namespace. If you use a connection string to an event hub, not a namespace, remove the `EntityPath` value (`;EntityPath=myeventhub`) at the end. **Don't forget** to remove the semicolon that precedes the `EntityPath` value.
4. Save the file.
5. Next, open a command window and change to the folder where you unzipped the TelcoGenerator application. Then enter the following command:
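The connection-string edit described in step 3 can be sketched as follows. The namespace name and key below are placeholders, and the helper function is mine, not part of the TelcoGenerator app.

```python
# Sketch of the manual edit from step 3: turning an event hub connection
# string into a namespace connection string by removing the trailing
# ";EntityPath=<name>" segment, semicolon included.

def strip_entity_path(conn: str) -> str:
    """Drop the ';EntityPath=...' suffix if present."""
    marker = ";EntityPath="
    idx = conn.find(marker)
    return conn[:idx] if idx != -1 else conn

conn = (
    "Endpoint=sb://mynamespace.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;"
    "SharedAccessKey=<key>;EntityPath=myeventhub"
)
namespace_conn = strip_entity_path(conn)  # no EntityPath segment remains
```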
Now that you have a stream of call events, you can create a Stream Analytics job that reads data from the event hub.
1. To create a Stream Analytics job, navigate to the [Azure portal](https://portal.azure.com/).
1. Select **All services** in the left menu, search for **Stream Analytics jobs**, hover the mouse over the **Stream Analytics jobs** tile, and then select the **+** button or select **Create** in the pop-up window.
:::image type="content" source="media/event-generator-app/find-stream-analytics-resource.png" alt-text="Screenshot showing how to find Stream Analytics in the Azure portal.":::
1. On the **New Stream Analytics job** page, follow these steps:
1. For **Subscription**, select the subscription that contains the Event Hubs namespace.
1. For **Resource group**, select the resource group you created earlier.
The next step is to define an input source for the job to read data using the event hub.
1. For **Event hub name**, select the event hub you created in the previous section. All the event hubs available in the selected namespace are listed in the dropdown.
1. For **Event hub consumer group**, keep the **Create new** option selected so that a new consumer group is created on the event hub. We recommend that you use a distinct consumer group for each Stream Analytics job. If no consumer group is specified, the Stream Analytics job uses the `$Default` consumer group. When a job contains a self-join or has multiple inputs, some inputs later might be read by more than one reader. This situation affects the number of readers in a single consumer group.
1. For **Authentication mode**, select **Connection string**. It's easier to test the tutorial with this option.
1. For **Event hub policy name**, select **Use existing**, and then select the default policy: **RootManageSharedAccessKey**.
1. Select **Save** at the bottom of the page.
:::image type="content" source="media/event-generator-app/configure-stream-analytics-input.png" alt-text="Screenshot showing the Event Hubs configuration page for an input.":::