This guide will help you set up and start using the IIoT Data Quality Assessment Service for the first time.
Before you begin, ensure you have the following installed on your system:
- Docker and Docker Compose - Required for running the application services
- Node.js 18+ - Required for running the frontend development server
- Git - For cloning the repository (if needed)
Note: Python is not required for end users as the backend runs in Docker containers.
If you haven't already, clone the repository:
```bash
git clone <repository-url>
cd fame-data-quality-assessment
```

Create a `.env` file from the example template:

```bash
cp env.example .env
```

Edit the `.env` file and add your OpenAI API key (required for the DQA Agent feature):

```bash
OPENAI_API_KEY=your_openai_api_key_here
```

Note: The DQA Agent feature requires a valid OpenAI API key. All other features will work without it.
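As a quick sanity check, you can confirm the key actually made it into `.env` before starting the services. A minimal sketch (run from the repository root; `.env` is only read, never modified):

```shell
# Confirm an OPENAI_API_KEY entry exists in .env (the value is not checked).
if grep -q '^OPENAI_API_KEY=' .env 2>/dev/null; then
  env_status="set"
else
  env_status="missing"
fi
echo "OPENAI_API_KEY is ${env_status} in .env"
```

If the key is reported missing, re-open `.env` and make sure the line starts exactly with `OPENAI_API_KEY=`.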
Run the startup script:
```bash
./start-dev.sh
```

This script will:
- Check that Docker is running
- Create the `.env` file if it doesn't exist
- Build and start all required services:
  - TimescaleDB database
  - DQA Worker service (background data aggregation)
  - FastAPI Backend API
- Wait for all services to be ready
- Start the frontend development server
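The steps above can be sketched as a dry run. The commands below are printed rather than executed, and they are an illustration of the flow, not the contents of the real `start-dev.sh`:

```shell
# Dry-run sketch of the start-dev.sh flow; NOT the actual script.
# Replace the body of run() with "$@" to actually execute each command.
run() { echo "+ $*"; }

run docker info                      # 1. check that Docker is running
run cp env.example .env              # 2. create .env if it does not exist
run docker-compose up -d --build     # 3. build and start all backend services
run docker-compose ps                # 4. wait until services report healthy
run npm run dev                      # 5. start the frontend dev server
```

The dry-run helper makes the sequence safe to read and run anywhere; the real script additionally waits for each service before moving on.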
Once all services are running, you can access:
- Frontend Application: http://localhost:5173
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
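Once the script finishes, a quick probe of the endpoints confirms they are reachable. A minimal sketch, assuming `curl` is installed:

```shell
# Probe each endpoint; curl -sf is silent and fails on HTTP errors.
probe_output=$(
  for url in http://localhost:5173 http://localhost:8000/docs; do
    if curl -sf --max-time 2 "$url" >/dev/null 2>&1; then
      echo "$url is reachable"
    else
      echo "$url is not reachable (is the stack running?)"
    fi
  done
)
echo "$probe_output"
```

If an endpoint is not reachable, give the services a few more seconds to start, then check the logs as described in the troubleshooting section.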
After starting the application, you should see a confirmation message indicating all services are ready. If any service fails to start, check the troubleshooting guide.
Open your web browser and go to http://localhost:5173. You should see the home page with cards for each feature:
- Data Loading - Select and view sensor data
- Data Visualization - Explore data through charts
- Missing Values - Analyze missing data
- Invalid Values - Detect invalid readings
- Data Quality - Comprehensive quality assessment
- DQA Agent - AI-powered chat assistant
Before you can analyze data, you need to import sensor data:
- Navigate to Data Loading from the home page
- Click on the Upload Data tab
- Follow the instructions in the Data Import Guide
Once data is imported:
- Select a machine group from the dropdown in Data Loading
- View raw or preprocessed sensor data
- Navigate to Data Visualization to see charts and analytics
- Check Data Quality for comprehensive quality metrics
The application consists of several interconnected services:
**Frontend**
- Modern web interface running on port 5173
- Provides all user interactions and visualizations
- Hot-reload enabled for development

**Backend API**
- RESTful API running on port 8000
- Handles data processing, analytics, and import operations
- Provides API documentation at `/docs`

**TimescaleDB Database**
- PostgreSQL with the TimescaleDB extension
- Stores raw sensor data and aggregated insights
- Optimized for time-series data

**DQA Worker**
- Background service for data aggregation
- Automatically processes and aggregates sensor data
- Runs continuously in the background
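A few Compose commands are handy for inspecting each of these services. Sketched as a dry run below; the service names (`backend`, `dqa-worker`, `db`) and the `postgres` user are assumptions — check `docker-compose.yml` for the real ones:

```shell
# Dry-run sketch for inspecting the running services; service names are
# assumptions. Replace the body of run() with "$@" to actually execute.
run() { echo "+ $*"; }

run docker-compose ps                         # overall service status
run docker-compose logs --tail=20 backend     # recent FastAPI backend logs
run docker-compose logs --tail=20 dqa-worker  # recent worker logs
run docker-compose exec db psql -U postgres -c 'SELECT now()'  # DB liveness
```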
To stop all services:
```bash
./stop-dev.sh
```

Or manually:

```bash
docker-compose down
```

Now that you have the application running:
- Import Data: Follow the Data Import Guide to upload your sensor data
- Load Data: Use the Data Loading Guide to select and view your data
- Visualize: Explore the Data Visualization Guide for charts and analytics
- Assess Quality: Check the Data Quality Guide for comprehensive assessment
If you encounter issues during setup:
- Check the Troubleshooting Guide for common problems
- Verify Docker is running: `docker info`
- Check service logs: `docker-compose logs`
- Ensure ports 5173, 8000, and 5432 are not in use by other applications
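To see whether one of those ports is already taken, bash's `/dev/tcp` pseudo-device gives a dependency-free probe. A sketch that assumes bash (under plain `sh` every port will simply report as free):

```shell
# Report whether each required port already has a listener on localhost.
port_report=""
for port in 5173 8000 5432; do
  if (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
    port_report="${port_report}port ${port} is already in use\n"
  else
    port_report="${port_report}port ${port} is free\n"
  fi
done
printf "%b" "$port_report"
```

A port reported "already in use" before you start the stack means another application holds it; stop that application or reconfigure the conflicting service.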
- Data Import Guide - Upload and import sensor data
- Data Loading Guide - Select and view data
- Main Documentation Index - Overview of all documentation
For technical details and development information, see the main README.