Autonomous robotics and embedded experimentation with ROS 2, motor control, and AI-assisted ultrasonic calibration.
Overview • Current Capabilities • Main Directories • Getting Started • Documentation • Contributing
This repository contains the ROS 2 workspace, Docker environment, embedded control logic, and calibration-related assets for a 1:14 scale autonomous vehicle prototype developed on Raspberry Pi 5.
The project is part of a broader research-oriented effort focused on multisensory embedded systems for autonomous vehicles, using a scaled platform as an initial validation environment before future migration to larger systems.
At its current stage, the repository supports motor control, keyboard and mobile teleoperation, ultrasonic-based safety logic, MJPEG video streaming, and AI-assisted calibration experiments for distance sensing.
- ROS 2-based motor control for a 1:14 scale vehicle
- Keyboard teleoperation through terminal / SSH
- Mobile teleoperation through WebSocket commands
- Live MJPEG camera streaming
- Ultrasonic obstacle monitoring and emergency-stop behavior
- AI-assisted ultrasonic calibration workflow
- Dockerized development and execution environment on Raspberry Pi 5
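The ultrasonic safety behavior above can be sketched as a pure decision function. This is an illustrative assumption, not code from the `motor_controller` package: the 20 cm threshold and function names are made up for the example.

```python
# Hypothetical sketch of the ultrasonic emergency-stop decision; the
# threshold value and names are illustrative assumptions.

SPEED_OF_SOUND_CM_S = 34_300  # approximate speed of sound in air at ~20 °C


def echo_to_distance_cm(pulse_duration_s: float) -> float:
    """Convert an HC-SR04 echo pulse duration to a one-way distance in cm."""
    # The echo pulse times the round trip sensor -> obstacle -> sensor,
    # so halve the travelled distance.
    return pulse_duration_s * SPEED_OF_SOUND_CM_S / 2.0


def should_emergency_stop(distance_cm: float, threshold_cm: float = 20.0) -> bool:
    """Trigger a stop when an obstacle is closer than the safety threshold."""
    return 0.0 < distance_cm < threshold_cm
```

A 1 ms echo pulse corresponds to roughly 17 cm, which would fall inside the assumed stop threshold.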
- `ros2_ws/`: ROS 2 workspace containing the `motor_controller` package and runtime nodes
- `model_ai_calibration/`: datasets, training scripts, exported models, and calibration experiments
- `Docker/`: containerized environment for reproducible setup and deployment
- `docs/`: extended technical documentation, setup notes, and project references
- `teleop_motor`: basic keyboard teleoperation for motor validation
- `pruebarayo`: integrated keyboard control, ultrasonic safety, and AI-corrected validation flow
- `rayows`: modular runtime for WebSocket teleoperation, MJPEG streaming, and integrated robot-side services
This project integrates software, embedded systems, electronics, and AI-based calibration technologies for the development of an autonomous 1:14 scale vehicle prototype.
- Python 3.11.9
- Docker
- ROS 2 Humble
- Linux / SSH remote access
- Raspberry Pi 5
- HC-SR04 ultrasonic sensor
- Pi Camera Module 3
- IFM O3D303 ToF sensor (planned integration)
- H-bridge motor driver
- DC motors
- GPIO-based actuator and sensor interfacing
- ROS 2 nodes for motor control and communication
- Keyboard-based teleoperation (WASD over SSH)
- Mobile app-based teleoperation through a joystick interface over WebSocket
- Migration from manual input to sensor-based autonomous input
- PWM-based motor speed control for acceleration, deceleration, and steering
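The PWM speed control can be illustrated with a small mapping from a normalized speed command to an H-bridge duty cycle. This is a hedged sketch: the clamp range, sign convention, and function name are assumptions, not taken from the actual `motor_controller` code.

```python
# Illustrative mapping from a normalized speed command to a PWM duty cycle
# for an H-bridge driver; range and sign convention are assumptions.

def speed_to_duty_cycle(speed: float) -> tuple[float, bool]:
    """Map speed in [-1.0, 1.0] to (duty cycle in percent, forward direction)."""
    speed = max(-1.0, min(1.0, speed))  # clamp to the valid command range
    return abs(speed) * 100.0, speed >= 0.0
```

On the real hardware the duty cycle would be applied to the enable pin of the H-bridge, with the direction flag selecting the input-pin pair.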
- CSV-based dataset generation
- Neural-network-based sensor correction
- Exported .keras and .h5 models
- Runtime calibration asset integration for validation flows
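The CSV dataset step can be sketched as follows. The column names (`raw_cm`, `reference_cm`) are assumptions for illustration; the actual datasets in `model_ai_calibration/` may use a different schema.

```python
# Hedged sketch of the assumed calibration dataset format: pairs of raw
# HC-SR04 readings and ground-truth reference distances. Column names are
# illustrative, not taken from the repository's datasets.

import csv
import io


def write_calibration_rows(rows: list[tuple[float, float]]) -> str:
    """Serialize (raw_cm, reference_cm) samples to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["raw_cm", "reference_cm"])  # header row
    writer.writerows(rows)
    return buf.getvalue()
```

A dataset in this shape is what the neural-network correction model would be trained on, with the raw reading as input and the reference distance as target.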
Before starting, make sure you have:
- Docker installed and running
- A Raspberry Pi 5 for GPIO and camera access
- This repository cloned locally
- The required hardware connected properly
Note: This project is intended to run on a Raspberry Pi 5 with hardware access enabled. Some features such as GPIO, camera streaming, and sensor interfacing will not work correctly on a standard desktop environment.
```bash
git clone https://github.com/CesarN27/ros2_autonomous_docker.git
cd ros2_autonomous_docker

docker build -t ros2-autonomous-gpio -f Docker/Dockerfile .

docker run -it --rm --privileged \
  --network host \
  -v $(pwd)/ros2_ws:/ros2_ws \
  -v $(pwd)/model_ai_calibration:/ros2_ws/src/sensor_ai \
  -v /dev:/dev \
  -v /run/udev:/run/udev:ro \
  ros2-autonomous-gpio
```

Inside the container:
```bash
cd /ros2_ws
rm -rf install build log
colcon build
source install/setup.bash
```

Main package: `motor_controller`
```bash
ros2 run motor_controller teleop_motor
```

Other available executables:
```bash
ros2 run motor_controller pruebarayo
ros2 run motor_controller rayows
```

This repository contains the ROS 2 runtime, embedded control logic, WebSocket bridge, and MJPEG streaming backend for the vehicle.
The mobile control application is maintained in a separate repository:
- Mobile App Repository: RayoMacApp
The external mobile app communicates with this repository through:
- WebSocket control: `ws://<robot-ip>:8765/`
- MJPEG video stream: `http://<robot-ip>:8080/stream`
Note: The mobile application is not included in this repository. This repository only provides the robot-side services consumed by the app.
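A control message over the WebSocket endpoint might look like the sketch below. The JSON field names and the normalized joystick convention are assumptions; the actual protocol is defined by the `rayows` node and the RayoMacApp client.

```python
# Hypothetical shape of a joystick command sent to ws://<robot-ip>:8765/;
# field names and value ranges are assumptions, not the actual protocol.

import json


def joystick_command(x: float, y: float) -> str:
    """Encode a normalized joystick position as a JSON control message."""
    x = max(-1.0, min(1.0, x))  # clamp both axes to [-1, 1]
    y = max(-1.0, min(1.0, y))
    return json.dumps({"type": "drive", "x": round(x, 3), "y": round(y, 3)})
```

On the robot side, such a message would be decoded and translated into PWM commands for the motor driver.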
```mermaid
flowchart LR
    APP[Mobile App / Teleoperation] -->|WebSocket / Commands| PI[Raspberry Pi 5]
    HC[HC-SR04] --> PI
    CAM[Pi Camera Module 3] --> PI
    TOF[IFM O3D303] --> PI
    PI --> ROS[ROS 2 Nodes]
    ROS --> CTRL[Motor Control Logic]
    CTRL --> HBRIDGE[H-Bridge Driver]
    HBRIDGE --> MOTORS[DC Motors]
    PI --> AI[AI Calibration Assets]
```
For detailed technical information, see the `docs/` directory.
The technical documentation includes:
- Hardware setup and wiring
- ROS 2 node descriptions
- Calibration pipeline details
- Validation scope
- Known limitations
- Bill of materials
Validated so far:
- Motor actuation and teleoperation
- HC-SR04 distance acquisition
- Emergency-stop logic
- MJPEG streaming backend
- Calibration dataset generation
- AI model training and export
Under active development:
- Camera integration refinement
- IFM O3D303 integration
- Multisensor fusion
- Autonomous decision layer
- Dockerized ROS 2 development environment
- Basic motor control package
- Keyboard-based teleoperation
- Ultrasonic dataset generation
- AI calibration model training
- Pi Camera Module 3 integration
- IFM O3D303 integration
- Sensor fusion stage
- Autonomous decision layer
- Full perception-validation workflow on vehicle prototype
- The project is hardware-dependent and requires Raspberry Pi GPIO access
- Full multisensor fusion is not yet implemented
- The mobile application is maintained in a separate repository
- Some AI-assisted runtime features depend on external model assets being present
