
Commit 794d575

Merge pull request #53357 from spelluru/intro-to-adx

Reinstating training module - Introduction to Azure Data Explorer

2 parents bd1ffcb + ad1110a

16 files changed: 440 additions, 40 deletions

.openpublishing.redirection.json

Lines changed: 0 additions & 35 deletions

@@ -409,41 +409,6 @@
       "redirect_url": "https://learn.microsoft.com/training/modules/monitor-azure-vm-using-diagnostic-data",
       "redirect_document_id": false
     },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/1-introduction.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/2-what-is-azure-data-explorer.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/3-how-azure-data-explorer-works.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/4-when-to-use-azure-data-explorer.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/5-knowledge-check.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/6-summary.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
-    {
-      "source_path_from_root": "/learn-pr/azure/intro-to-azure-data-explorer/index.yml",
-      "redirect_url": "https://learn.microsoft.com/training/modules/upload-download-and-manage-data-with-azure-storage-explorer",
-      "redirect_document_id": false
-    },
     {
       "source_path_from_root": "/learn-pr/achievements/learn.azure.deploy-java-spring-boot-app-service-mysql.badge.yml",
       "redirect_url": "https://learn.microsoft.com/azure/app-service/tutorial-java-spring-cosmosdb",

learn-pr/achievements.yml

Lines changed: 0 additions & 5 deletions

@@ -40,11 +40,6 @@ achievements:
     title: Implement message-based communication workflows with Azure Service Bus
     summary: Write C# code in a custom application that sends and receives messages using Azure Service Bus topics and queues.
     iconUrl: /training/achievements/implement-message-workflows-with-service-bus.svg
-  - uid: learn.intro-to-azure-data-explorer.badge
-    type: badge
-    title: Introduction to Azure Data Explorer
-    summary: Learn to describe the ingestion, query, visualization, and data management features that Azure Data Explorer provides to help you make sense of the data flowing into your business.
-    iconUrl: /training/achievements/intro-to-azure-data-explorer.svg
  - uid: learn.azure.intro-to-azure-monitor.badge
    type: badge
    title: Introduction to Azure Monitor
Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.introduction
+title: Introduction
+metadata:
+  unitType: introduction
+  title: Introduction to Azure Data Explorer
+  description: Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data.
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+durationInMinutes: 3
+content: |
+  [!include[](includes/1-introduction.md)]
Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.what-is-azure-data-explorer
+title: What is Azure Data Explorer?
+metadata:
+  unitType: learning-content
+  title: Introduction to Azure Data Explorer
+  description: Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data.
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+durationInMinutes: 5
+content: |
+  [!include[](includes/2-what-is-azure-data-explorer.md)]
Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.how-azure-data-explorer-works
+title: How Azure Data Explorer works
+metadata:
+  unitType: learning-content
+  title: Introduction to Azure Data Explorer
+  description: Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data.
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+durationInMinutes: 10
+content: |
+  [!include[](includes/3-how-azure-data-explorer-works.md)]
Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.when-to-use-azure-data-explorer
+title: When to use Azure Data Explorer
+metadata:
+  unitType: learning-content
+  title: Introduction to Azure Data Explorer
+  description: Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data.
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+durationInMinutes: 5
+content: |
+  [!include[](includes/4-when-to-use-azure-data-explorer.md)]
Lines changed: 85 additions & 0 deletions

@@ -0,0 +1,85 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.knowledge-check
+title: Module assessment
+metadata:
+  unitType: knowledge_check
+  title: Module assessment
+  description: "Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data."
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+  module_assessment: true
+durationInMinutes: 3
+content: |
+###########################################################################
+###
+### General guidance (https://review.learn.microsoft.com/learn-docs/docs/id-guidance-knowledge-check)
+###  - Questions are complete sentences ending with a question mark
+###  - No true/false questions
+###  - 3 answers per question
+###  - All answers about the same length
+###  - Numeric answers listed in sorted order
+###  - No "All of the above" and/or "None of the above" as answer choices
+###  - No "Not" or "Except" in questions
+###  - No second person ("you") in the questions or answers
+###  - Provide a meaningful explanation for both correct and incorrect answers
+###
+### Question content requirements:
+### - Write 5 questions
+### - Questions 1,2 must test this Learning Objective: "Describe how <attributes> of <product> work to <solve problem>"
+###   Guidance: These two questions can be short; there's no need for a long scenario to analyze. Test whether the learner understands how the product works.
+###   Example: "What differentiates an action from a control action in an Azure Logic App?"
+### - Questions 3,4,5 must test this Learning Objective: "Evaluate whether <product> is appropriate to <general product use case>"
+###   Guidance: Use scenario questions that ask the learner to analyze a situation with the "when to use" criteria presented in the module.
+###   Example: "Suppose you work for a financial company. You're building a system to let your brokers trade financial instruments. Your system must monitor market conditions, detect changes, and execute trades. You'll need to handle a large volume of transactions and you'll need to do it quickly. The faster you can complete trades, the more of an advantage you'll have over your competitors. Which requirement of this system would be difficult for Logic Apps to satisfy?"
+###
+###########################################################################
+quiz:
+  questions:
+  - content: "What best describes the Azure Data Explorer service?"
+    choices:
+    - content: "Big data storage optimized for batched, non-interactive queries."
+      isCorrect: false
+      explanation: "Azure Data Explorer is optimized for exploratory, interactive queries. It's suitable for both batched and streaming data."
+    - content: "A big data solution for data transformations."
+      isCorrect: false
+      explanation: "Azure Data Explorer is designed for data queries, not data transformations."
+    - content: "A big data analytics cloud platform optimized for high-volume, interactive queries."
+      isCorrect: true
+      explanation: "Azure Data Explorer is a big data analytics platform that makes it easy to analyze high volumes of data in near real time."
+  - content: "What is a control command in Azure Data Explorer?"
+    choices:
+    - content: "A command used for maintenance and policy tasks."
+      isCorrect: true
+      explanation: "Control commands include the creation of new clusters or databases, data connections, autoscaling, cluster configurations, entities, metadata objects, and security policies."
+    - content: "A query."
+      isCorrect: false
+      explanation: "Queries are used for interactive data analytics."
+    - content: "A way to visualize data."
+      isCorrect: false
+      explanation: "Data can be visualized with the native dashboard experience, or with connectors to external services such as Power BI and Grafana."
+  - content: "Which of the following tasks is an appropriate use case for Azure Data Explorer?"
+    choices:
+    - content: "Log and text analytics."
+      isCorrect: true
+      explanation: "Log and text analytics is an excellent use case for Azure Data Explorer."
+    - content: "In-memory batch processing of petabytes of information."
+      isCorrect: false
+      explanation: "Azure Data Explorer is optimized to use indexed data for fast analytics responses."
+    - content: "Real-time analytics."
+      isCorrect: false
+      explanation: "Azure Data Explorer provides near real-time analytics rather than strict real-time analytics."
+  - content: "Which aspect of an extract, transform, and load (ETL) process makes it unsuitable for Azure Data Explorer?"
+    choices:
+    - content: "Long-running tasks."
+      isCorrect: true
+      explanation: "Azure Data Explorer isn't optimized for long-running tasks, so running massive nightly ETL jobs on it isn't recommended."
+    - content: "Large volumes of data for ingestion."
+      isCorrect: false
+      explanation: "Azure Data Explorer is built as a big data analytics platform."
+    - content: "High query concurrency."
+      isCorrect: false
+      explanation: "Azure Data Explorer is optimized for high query concurrency."
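As background for the control-command question above: management ("control") commands in Azure Data Explorer begin with a dot and change cluster or database metadata, whereas queries read data with KQL operators. A minimal sketch for contrast (the table name and schema are made up for illustration):

```kusto
// Management ("control") command: creates a table and defines its schema.
// It modifies database metadata rather than reading data.
.create table ClickstreamEvents (Timestamp: datetime, UserId: string, PageUrl: string)

// A query, by contrast, reads data with KQL operators:
ClickstreamEvents
| count
```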
Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+### YamlMime:ModuleUnit
+uid: learn.intro-to-azure-data-explorer.summary
+title: Summary
+metadata:
+  unitType: summary
+  title: Introduction to Azure Data Explorer
+  description: Evaluate whether Azure Data Explorer is appropriate to process and analyze your big data.
+  ms.date: 02/09/2026
+  author: uribarash
+  ms.author: urib
+  ms.topic: unit
+durationInMinutes: 3
+content: |
+  [!include[](includes/6-summary.md)]
Lines changed: 30 additions & 0 deletions

@@ -0,0 +1,30 @@
+Daily operations and interactions with customers create a constant flow of data. This world of big data is growing steadily, and so is the need to store, process, and analyze the data in a timely and cost-efficient way. Big data requires large amounts of scalable storage space. Because huge volumes of data flow in at high velocity from various sources, the ability to identify and respond to meaningful events is key. Additionally, data is generated in various formats: structured and semi-structured data, free text, and even images and videos. To find correlations between these different data flows, businesses invest significant time and money in parsing, processing, and storing the data. A robust end-to-end data analytics system that can manage your huge, complex data and run advanced analytics is essential to making data-driven business decisions. What tool can help you manage this vast array of data types, workflows, and visualizations?
+
+> [!VIDEO https://learn-video.azurefd.net/vod/player?id=02355c8a-a1f6-4740-ac9a-8d5c876e7ea7]
+
+Azure Data Explorer is a fully managed, high-performance big data analytics platform. It can take all this varied data and then ingest, process, and store it. You can use Azure Data Explorer for near real-time queries and advanced analytics, as well as for features such as geospatial analytics, alerting, dashboarding, and business analytics.
+
+## Example scenario
+
+Imagine you work at a clothing company, a large chain of brick-and-mortar stores that's expanding into e-commerce. You're about to launch your end-of-year sale targeting several international audiences. You want to see how the campaign affects sales, inventory, and logistics. You have a large volume of data flowing in different formats, and you need a way to make sense of this data and use it to make good business decisions.
+
+Different divisions across the company will use the collected data to inform their strategic and day-to-day decisions on operations, marketing, and customer relations. They plan to use Azure Data Explorer to ingest various data types into a single collection composed of:
+
+- **Structured data**: Such as internal operations systems.
+- **Semi-structured data**: Such as marketing clickstream data.
+- **Unstructured data**: Such as social media feeds.
+
+Then, each division can use data analysis and visualization to make data-driven decisions about the campaign.
+
+## What will we be doing?
+
+You'll analyze the capabilities of Azure Data Explorer to help you decide when to use it, answering questions like:
+
+- What are the strengths of Azure Data Explorer and the Kusto Query Language?
+- How do you work with the service?
+- What types of data can you analyze, and where can the data come from?
+- How can you organize, display, or make actionable the results of your queries?
+
+## What is the main goal?
+
+By the end of this module, you'll be able to decide whether Azure Data Explorer is a good choice to help you make sense of your big data.
Lines changed: 53 additions & 0 deletions

@@ -0,0 +1,53 @@
+Let's start by defining the service and taking a tour of the core features of Azure Data Explorer. This overview should help you decide whether it's the right service for managing and analyzing your data.
+
+## What is Azure Data Explorer?
+
+Azure Data Explorer is a big data analytics platform that makes it easy to analyze high volumes of data in near real time. It lets you extract key insights, spot patterns and trends, and create forecasting models.
+
+The Azure Data Explorer toolbox gives you an end-to-end solution for data ingestion, query, visualization, and management. These tools let you analyze structured, semi-structured, and unstructured data across time series and apply machine learning.
+
+Azure Data Explorer is fully managed, scalable, secure, robust, and enterprise-ready. It's useful for log analytics, time series analytics, IoT, and general-purpose exploratory analytics.
+
+## How to understand your big data
+
+Remember our example clothing company? They have many types of data coming in from varied domains. They need to run different kinds of analytics on these data types and then share the results with a range of stakeholders. They're going to use Azure Data Explorer to get insights from all of their data across the company.
+
+**Production** analyzes product logs to manage inventory and make manufacturing decisions. Geospatial analytics informs these decisions by identifying geographical areas with high-performing ads and anticipating inventory needs.
+
+The company's warehouses are outfitted with IoT devices. **Security** uses some of them to manage warehouse entry and exit logs; operations uses others to monitor the environment inside the warehouse. Individual stores use time series analytics to identify sales anomalies and predict future inventory events.
+
+The global **marketing** team uses clickstream data (also a form of log analytics) to optimize online ad campaigns and the customer funnel. The customer success department uses text search to analyze user feedback on social media.
+
+Every minute of the day, a company decision is being made based on data flowing into Azure Data Explorer.
+
+## What are some of Azure Data Explorer's key features?
+
+Now that you have an idea of what Azure Data Explorer can be used for, let's look at some of its key features.
+
+### Data velocity, variety, and volume
+
+Azure Data Explorer can ingest terabytes of data in minutes in batch or streaming mode. It can query petabytes of data and return results within milliseconds to seconds. This capacity allows for high-velocity (millions of events per second), low-latency (seconds), linearly scalable ingestion of raw data. The raw data can be ingested in different formats and structures, and it can flow in from various pipelines and sources.
+
+### User-friendly query language
+
+Azure Data Explorer uses the Kusto Query Language (KQL), an open-source language originally invented by the team. The language is simple to understand and learn, and it's highly productive. You can use simple operators and advanced analytics.
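As an illustration of the simple-operator style described above, here's a small KQL query sketch (the table and column names are hypothetical):

```kusto
// Counts events per hour over the last day and renders a time chart.
ClickstreamEvents
| where Timestamp > ago(1d)
| summarize EventCount = count() by bin(Timestamp, 1h)
| render timechart
```

Each pipe stage takes the previous stage's output as input, so queries read top to bottom like a data pipeline.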
+### Advanced analytics
+
+Azure Data Explorer has a large set of functions for time series analysis. Functions include adding and subtracting time series, filtering, regression, seasonality detection, geospatial analysis, anomaly detection, scanning, and forecasting. Time series functions are optimized for processing thousands of time series in seconds. Pattern detection is made easy with clustering plugins that can diagnose anomalies and do root cause analysis. You can also extend Azure Data Explorer's capabilities by embedding Python code in KQL queries.
+
+### Easy-to-use wizard
+
+The ingestion wizard makes the data ingestion process easy, fast, and intuitive. The web UI provides an intuitive, guided experience that helps customers ramp up quickly and start ingesting data, creating database tables, and mapping structures. It enables one-time or continuous ingestion from various sources in various data formats. Table mappings and schemas are auto-suggested and easy to modify.
+
+### Versatile data visualization
+
+Data visualization helps you gain important insights. Azure Data Explorer offers built-in visualization and dashboarding out of the box, with support for various charts and visualizations. It has native integration with Power BI; native connectors for Grafana, Kibana, and Databricks; and ODBC support for Tableau, Sisense, Qlik, and more.
+
+### Automatic ingest, process, and export
+
+Azure Data Explorer supports server-side stored functions, continuous ingestion, and continuous export to Azure Data Lake Store. It also supports server-side ingestion-time mapping transformations, update policies, and precomputed scheduled aggregates with materialized views.
+
+### Integration with other services
+
+Integrate easily and seamlessly with other tools in all aspects of your workflow, such as **ingestion**, **visualization**, **orchestration**, and **monitoring**.
