This guide explains how to select machine groups, view sensor data, and understand metadata in the Data Loading page.
The Data Loading page is your starting point for working with sensor data. It allows you to:
- Select machine groups (tables) that contain your sensor data
- View raw sensor data from the database
- View preprocessed/aggregated sensor data
- Understand sensor metadata and thresholds
- Check aggregation frequency settings
- Open the application at http://localhost:5173
- Click Data Loading in the sidebar or home page
- You'll see two tabs:
- Load from Database - Select and view existing data
- Upload Data - Import new data (see Data Import Guide)
- In the Load from Database tab, find the Select Machine Group dropdown
- Click the dropdown to see available machine groups
- Select the machine group you want to analyze
Available tables:
- Tables are created when you import data (see Data Import Guide)
- Table names correspond to machine group identifiers
- Example: `KT2201`, `K3301`, `K5700`
After selecting a table, the system automatically loads:
- Aggregation Frequency: How often data is aggregated
- Available Sensors: List of sensors in the selected machine group
- Raw Data: Original sensor readings
- Preprocessed Data: Aggregated/processed sensor data
Raw data shows the original sensor readings as imported:
- Timestamp: Time of each reading
- Sensor Columns: One column per sensor tag
- Values: Actual sensor readings at each timestamp
- Missing Values: Shown as empty cells or `NaN`
When to use:
- Viewing original data before processing
- Checking data completeness
- Verifying import accuracy
Viewing Raw Data:
- Select a machine group
- Click Show Raw Data toggle
- Data appears in a table format
- Use search to filter by sensor or timestamp
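Checking data completeness can also be done offline on an exported table. The sketch below counts missing readings per sensor in plain Python; the sensor tags, values, and row layout are illustrative, not taken from a real export.

```python
from math import isnan

# Hypothetical raw rows as they might appear after export; both empty
# cells and NaN represent missing readings, as in the raw-data view.
rows = [
    {"timestamp": "2024-01-01 00:00:00", "22PI102": 5.1, "22TI201": ""},
    {"timestamp": "2024-01-01 00:01:00", "22PI102": float("nan"), "22TI201": 88.4},
    {"timestamp": "2024-01-01 00:02:00", "22PI102": 5.3, "22TI201": 87.9},
]

def is_missing(value):
    """Treat empty cells and NaN as missing."""
    if value == "":
        return True
    return isinstance(value, float) and isnan(value)

missing = {
    tag: sum(is_missing(row[tag]) for row in rows)
    for tag in rows[0] if tag != "timestamp"
}
print(missing)  # {'22PI102': 1, '22TI201': 1}
```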
Preprocessed data shows aggregated sensor readings:
- Aggregated Values: Data processed by aggregation rules
- Reduced Frequency: Data sampled at aggregation intervals
- Applied Rules: Min, max, avg, or sum based on sensor configuration
When to use:
- Analyzing trends over time
- Reducing data volume for visualization
- Working with aggregated insights
Viewing Preprocessed Data:
- Select a machine group
- Preprocessed data loads automatically
- Click Show Preprocessed Data to view
- Data appears in table format
The aggregation frequency indicates how often sensor data is aggregated:
- Display: Shown in seconds (e.g., `3600` = 1 hour)
- Purpose: Determines time intervals for aggregated data
- Configuration: Set during worker service configuration

Understanding the value:
- `3600` seconds = 1 hour aggregation window
- `600` seconds = 10 minute aggregation window
- Lower values = more frequent aggregation = more data points
Note: Aggregation frequency affects the granularity of preprocessed data and visualizations.
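To illustrate how an aggregation window works, the sketch below buckets readings into windows of `frequency_s` seconds and applies an `avg` rule. The timestamps and values are made up, and the actual worker service implementation may differ; this only demonstrates the windowing concept.

```python
from datetime import datetime

# Illustrative readings for one sensor; the 3600 s window and "avg" rule
# mirror the settings described above.
readings = [
    ("2024-01-01 00:10:00", 4.0),
    ("2024-01-01 00:50:00", 6.0),
    ("2024-01-01 01:20:00", 10.0),
]
frequency_s = 3600  # aggregation frequency in seconds (1 hour)

base = datetime.fromisoformat(readings[0][0]).replace(minute=0, second=0)
buckets = {}
for ts, value in readings:
    offset = (datetime.fromisoformat(ts) - base).total_seconds()
    window = int(offset // frequency_s)  # 0 = first hour, 1 = second hour, ...
    buckets.setdefault(window, []).append(value)

# "avg" rule; sensors configured with min/max/sum would use the matching builtin.
aggregated = {w: sum(v) / len(v) for w, v in sorted(buckets.items())}
print(aggregated)  # {0: 5.0, 1: 10.0}
```

Halving `frequency_s` would double the number of windows, which is why lower values mean more data points.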
Sensor metadata provides information about each sensor:
- Select a machine group
- Click Show Tags toggle
- View metadata table with sensor information
Each sensor has the following metadata:
- TAG: Unique sensor identifier (e.g., `22PI102`)
- Tag Description: Human-readable description
- Machine Group: Machine group identifier
- Low Threshold: Lower limit for valid readings
- High Threshold: Upper limit for valid readings
- Threshold Type: `Up`, `Down`, or `Up/Down`
- Aggregation Rule: How data is aggregated (`min`, `max`, `avg`, `sum`)
- Engineering Units: Measurement units (e.g., `Kgf/cm2`, `degC`)
- Category: Sensor category (e.g., `Pressure`, `Temperature`)
Metadata helps you:
- Understand what each sensor measures
- Know valid value ranges (thresholds)
- Understand how data is processed
- Identify sensor categories
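The threshold fields can be used to flag out-of-range readings yourself. A minimal sketch, assuming `Up` checks only the high threshold, `Down` only the low one, and `Up/Down` both; the metadata values and readings below are illustrative.

```python
# Hypothetical metadata row mirroring the fields described above.
meta = {"TAG": "22PI102", "low": 0.0, "high": 12.0, "threshold_type": "Up/Down"}

def is_valid(value, low, high, threshold_type):
    """Apply the threshold type: Up checks the high limit, Down the low
    limit, Up/Down both."""
    if threshold_type in ("Up", "Up/Down") and value > high:
        return False
    if threshold_type in ("Down", "Up/Down") and value < low:
        return False
    return True

readings = [5.1, 13.7, -0.4]
flags = [is_valid(v, meta["low"], meta["high"], meta["threshold_type"])
         for v in readings]
print(flags)  # [True, False, False]
```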
Search Raw Data:
- Use the search box to filter by sensor tag or timestamp
- Searches across all columns
- Real-time filtering as you type
Search Preprocessed Data:
- Similar search functionality
- Filters aggregated data table
Export data for external analysis:
- Select your data view (raw or preprocessed)
- Click Export Data button (if available)
- Data downloads as CSV file
Note: Export functionality may vary based on data size and browser capabilities.
The Data Loading page allows you to choose data sources:
- Database: Load from TimescaleDB (default)
- Upload: Import new data files (see Data Import Guide)
- Format: `YYYY-MM-DD HH:MM:SS` or ISO format
- Purpose: Time reference for all sensor readings
- Required: First column in raw data
- Naming: Matches sensor TAG from metadata
- Values: Numeric sensor readings
- Missing: Empty or `NaN` for missing values
- Units: Defined in metadata (Engineering Units)
- Each row represents one time point
- Contains timestamp and all sensor readings
- Rows are ordered chronologically
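These format rules can be verified on an exported CSV using only the standard library. The sketch below checks that the timestamp is the first column, parses it, and confirms rows are chronological; the column names and data are illustrative.

```python
import csv
import io
from datetime import datetime

# Minimal CSV matching the row format described above.
raw = """timestamp,22PI102,22TI201
2024-01-01 00:00:00,5.1,88.0
2024-01-01 01:00:00,5.3,87.5
"""

reader = csv.reader(io.StringIO(raw))
header = next(reader)
assert header[0] == "timestamp"  # timestamp must be the first column

# Parse each timestamp and check chronological ordering.
timestamps = [datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S") for row in reader]
chronological = all(a <= b for a, b in zip(timestamps, timestamps[1:]))
print(chronological)  # True
```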
- Start with Preprocessed Data: Easier to work with for initial analysis
- Check Aggregation Frequency: Understand data granularity
- Review Metadata: Know sensor thresholds and units
- Use Search: Quickly find specific sensors or time ranges
- Verify Data Completeness: Check for missing values in raw data
- Select machine group
- Use search to filter by sensor tag
- Or navigate to Data Visualization to select specific sensors
- View raw or preprocessed data
- Check first and last timestamps
- Verify date range matches expectations
- Click Show Tags to view metadata
- Find sensor of interest
- Check Low/High Threshold values
- Note the Threshold Type (`Up`, `Down`, or `Up/Down`)
After loading your data:
- Visualize: Go to Data Visualization Guide to create charts
- Check Quality: Use Data Quality Guide for quality assessment
- Find Missing Values: See Missing Values Guide
- Detect Invalid Values: Check Invalid Values Guide
Problem: Dropdown is empty
Solutions:
- Import data first (see Data Import Guide)
- Check database connection
- Verify backend service is running
Problem: Selected table but no data appears
Solutions:
- Check backend logs: `docker-compose logs backend`
- Verify table exists in database
- Check for error messages in UI
- Refresh the page
Problem: Expected sensors not visible
Solutions:
- Verify sensors were imported (check import logs)
- Check sensor tags match metadata
- Review data import process
- Data Import Guide - Import new sensor data
- Data Visualization Guide - Create visualizations
- Data Quality Guide - Assess data quality
For technical details, see the Backend API Documentation.