Autonomous Mobile Robots (AMRs) offer transformative potential in industrial, agricultural, and educational sectors where low-cost, scalable automation is critical. This study presents ROS Ranger Mini, a compact ROS-enabled AMR that integrates a depth camera, ultrasonic sensors, and radar for robust environment perception and obstacle avoidance. The system combines a Raspberry Pi for high-level processing with the Particle Photon 2 development board, which enables dual operational modes—remote control and autonomous navigation—via Wi-Fi. Sensor and status data are transmitted and stored in real time using Particle Cloud, enabling remote monitoring and system analysis. The design underscores how accessible technologies, when integrated thoughtfully, can create adaptable and efficient robotic platforms for a variety of field applications.
🔵Introducing Particle Cloud
Particle Cloud is an IoT platform that enables secure device connectivity, real-time data storage, and remote control over the internet. In this project, it provides the communication link between the AMR and a remote user interface.
We use the Particle Photon 2, a powerful Wi-Fi-enabled microcontroller that integrates easily with Particle Cloud. Key specifications include:
- Processor: 200 MHz ARM Cortex-M33 (Realtek RTL8721DM)
- Wi-Fi: 802.11 b/g/n, 2.4 GHz
- RAM: 2 MB
- Flash: 4 MB
- GPIOs: 20+ configurable pins
- Secure Boot and OTA update support
This board handles real-time data transmission, enabling both autonomous and remote control functionalities.
🟢Hardware Section
In this section, all connections between the motor driver, actuators, camera, ultrasonic sensors, Particle Photon 2, and Raspberry Pi are established to enable communication and feedback exchange. I have also included a 5 A step-down (buck) converter in the circuit because we usually use a 2S LiPo or 2S 18650 battery, which supplies a nominal 7.4 volts. Since the Photon 2 can only accept up to 5.5 volts, as specified in its datasheet, the step-down converter is necessary to regulate the voltage safely.
Here is the circuit diagram I designed for this bot using Fusion EDA.
The circuit has three main sections: the Particle Photon 2, a Raspberry Pi HAT, and a 5–15V to 5V step-down voltage converter. The camera is connected to the Raspberry Pi via either a USB port or a 15-pin CSI connector. This diagram also includes two ultrasonic sensors that replicate the function of a LiDAR sensor, helping to reduce the overall cost of the bot. The Photon 2's D1 and D2 pins are connected to the Raspberry Pi for communication, sending acknowledgements, and receiving feedback.
🟡Integration of Photon 2
First, you need to register on the Particle website. After that, you can check whether your device is connected; my device shows up on their platform.
To integrate the Photon 2, first download VS Code and install the Particle CLI. You can then use the terminal in VS Code to send commands directly.
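For example, once the CLI is installed, you can log in and confirm your device is online straight from the VS Code terminal (both are standard Particle CLI commands):

```
particle login
particle list
```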
While programming with the Particle Photon 2, keep in mind that it has multiple modes. It's important to understand the meanings of the RGB LED indicators.
🟠Standard RGB LED Modes on Particle Photon
From here, we have two options: writing code in VS Code or using the Particle Web IDE. Initially, I used VS Code but encountered some issues, so I switched to the Particle Web IDE. It has a very easy-to-understand user interface. I simply wrote the code, and since the Web IDE is already connected to Particle Cloud, there is no need to write any additional code for connecting to Wi-Fi or the cloud.
MIT App Inventor is a free, beginner-friendly platform developed by MIT for creating mobile apps using a visual, block-based programming interface.
Used for:
- Building Android apps without coding
- Prototyping IoT interfaces
- Educational projects and rapid app development
Here's a short step-by-step guide to building a simple app with MIT App Inventor:
- Visit appinventor.mit.edu and sign in with your Google account.
- Click "Create Apps" → start a new project.
- Design UI by dragging components (buttons, labels, etc.) to the phone screen.
- Go to the Blocks section → create logic by connecting code blocks.
- Connect the phone via USB or Wi-Fi using the MIT AI2 Companion app.
- Test your app live on your phone.
- Export or build an APK to install or share the app.
After completing the design, you can proceed to the Blocks section. All the backend logic is handled through a block-based system, which acts as an intermediary between your app and the hardware.
The pink blocks in the image are the Text blocks in MIT App Inventor, which contain the URL and the POST data for your web requests.
The URL includes your device ID (0a10aced20219...).
The POST text contains your access token (b648206178aff3a71dcdd8e3f3...) along with a command argument (arg=right, arg=left, etc.).
This token and device ID are unique identifiers that allow your app to communicate with your specific device via the Particle API.
Example:
I used the following URL in the app's URL section:
https://api.particle.io/v1/devices/0a10aced202194944a056838/motor
The device ID for my Photon 2, 0a10aced202194944a056838, appears in the URL above.
For each button in the app, I used the corresponding arguments:
- Forward: arg=forward&access_token=b648206178aff3a71dcdd8e3f32db6b99dac6d98
- Backward: arg=backward&access_token=b648206178aff3a71dcdd8e3f32db6b99dac6d98
- Left: arg=left&access_token=b648206178aff3a71dcdd8e3f32db6b99dac6d98
- Right: arg=right&access_token=b648206178aff3a71dcdd8e3f32db6b99dac6d98
- Stop: arg=stop&access_token=b648206178aff3a71dcdd8e3f32db6b99dac6d98
Every button's request also carries an access token. This is the token for my device: b648206178aff3a71dcdd8e3f32db6b99dac6d98
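Before wiring up the app, you can test the same request with a short Python script. This is a minimal sketch using the requests library; it assumes the Photon 2 firmware exposes a Particle.function() named motor (matching the /motor endpoint above) that accepts these argument strings.

```python
import requests

# My device ID and access token from above; substitute your own.
DEVICE_ID = "0a10aced202194944a056838"
ACCESS_TOKEN = "b648206178aff3a71dcdd8e3f32db6b99dac6d98"

# Particle Cloud endpoint for calling the cloud function named "motor".
URL = f"https://api.particle.io/v1/devices/{DEVICE_ID}/motor"

def send_command(command: str) -> int:
    """POST a drive command (forward/backward/left/right/stop) to the bot."""
    response = requests.post(
        URL,
        data={"arg": command, "access_token": ACCESS_TOKEN},
    )
    response.raise_for_status()
    # The Particle API echoes the cloud function's integer return value.
    return response.json().get("return_value", -1)

if __name__ == "__main__":
    print(send_command("forward"))
    print(send_command("stop"))
```

This mirrors exactly what each app button does: one POST with the command in arg and the token in access_token.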
So the question is: where can you find your token and device ID? The device ID is shown in the Particle console (and by particle list). For the token, open the Particle CLI and type:
particle login
Enter the email and password you used to register with Particle, then run:
particle token list
And here we have the token.
And with that, we have completed one section of the AMR. Here's what it looks like after all that!
🟣Camera and Ultrasonic Sensor Integration with ROS
This section describes how a pair of HC-SR04 ultrasonic sensors mounted on a servo motor is used with ROS 2 to perform 2D environment mapping. The system collects distance data at various angles and visualizes it in RViz.
System Overview
- The Arduino rotates the ultrasonic sensors using a servo motor from 0° to 180° in small steps (e.g., every 5°).
- At each angle, both ultrasonic sensors measure distances.
- The Arduino sends a line of data over serial in the format: angle:distance_left:distance_right
- A ROS 2 node running on the Pi:
  - Reads this serial data.
  - Publishes the sensor readings as ROS messages.
  - Optionally transforms them into 2D Cartesian coordinates (see the sketch below).
- The data is visualized in RViz to build a 2D occupancy map.
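As a sketch of that optional Cartesian step: each serial line is split on the colons and the two ranges are projected with basic trigonometry. I'm assuming here that distances arrive in centimeters and that the right sensor is mounted 180° opposite the left one, so a 0–180° sweep covers a full circle; adjust the offset if your mounting differs.

```python
import math

def parse_scan_line(line: str) -> list[tuple[float, float]]:
    """Convert 'angle:distance_left:distance_right' into 2D (x, y) points."""
    angle_deg, dist_left, dist_right = (float(v) for v in line.strip().split(":"))
    points = []
    # Left sensor faces the servo angle; right sensor assumed to face angle + 180°.
    for offset, dist in ((0.0, dist_left), (180.0, dist_right)):
        theta = math.radians(angle_deg + offset)
        points.append((dist * math.cos(theta), dist * math.sin(theta)))
    return points

# Example: a reading at 45° with the left sensor at 100 cm, right at 80 cm.
print(parse_scan_line("45:100:80"))
```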
Arduino Code Responsibilities
- Sweep the servo motor from 0° to 180°.
- Trigger the ultrasonic sensors and read the distances.
- Send the angle and distances via Serial.print() every X milliseconds.
ROS 2 Node Responsibilities
- Read the serial data using pyserial.
- Parse the incoming strings.
- Publish the data using sensor_msgs/msg/PointCloud2 or a custom message (e.g., Point2D[]).
- Publish the points to /ultrasonic_scan or a similar topic.
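Putting those responsibilities together, here is a minimal sketch of such a node using rclpy and pyserial. The serial port, baud rate, and frame name are assumptions, and it publishes PointCloud2 via the sensor_msgs_py helper instead of a custom message so the example stays self-contained.

```python
import math
import serial  # pyserial
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2
from std_msgs.msg import Header

class SerialScanNode(Node):
    """Reads 'angle:distance_left:distance_right' lines and publishes points."""

    def __init__(self):
        super().__init__("serial_reader_node")
        # Assumed port and baud rate; match your Arduino sketch.
        self.port = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)
        self.publisher = self.create_publisher(PointCloud2, "/ultrasonic_scan", 10)
        self.points = []  # accumulated (x, y, z) points; cap this for long runs
        self.create_timer(0.02, self.poll_serial)

    def poll_serial(self):
        line = self.port.readline().decode(errors="ignore").strip()
        if not line:
            return
        try:
            angle, d_left, d_right = (float(v) for v in line.split(":"))
        except ValueError:
            return  # skip malformed lines
        # Left sensor at `angle`; right sensor assumed 180° opposite.
        for offset, dist_cm in ((0.0, d_left), (180.0, d_right)):
            theta = math.radians(angle + offset)
            d = dist_cm / 100.0  # cm -> meters
            self.points.append((d * math.cos(theta), d * math.sin(theta), 0.0))
        header = Header(frame_id="base_link", stamp=self.get_clock().now().to_msg())
        self.publisher.publish(point_cloud2.create_cloud_xyz32(header, self.points))

def main():
    rclpy.init()
    rclpy.spin(SerialScanNode())

if __name__ == "__main__":
    main()
```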
Visualization in RViz
To visualize:
- Launch RViz with a config that subscribes to your scan topic.
- Use PointCloud2 for scattered points.
- Use MarkerArray for visualization markers.
ROS 2 Packages
1. serial_comms
- Purpose: Reads and parses the serial data from the Arduino (distance, angle)
- Node: serial_reader_node
- Publishes: /sensor_data (custom_msgs/SensorData)
- Dependencies: rclpy, pyserial
2. sensor_fusion_node
- Purpose: Fuses ultrasonic ranges, servo angles, and odometry data
- Node: fusion_node
- Subscribes: /sensor_data, /odom
- Publishes: /fused_scan (sensor_msgs/LaserScan), /amr_pose (geometry_msgs/PoseStamped)
3. mapping_visualizer
- Purpose: Visualizes the fused data in RViz2
- Node: rviz_mapper
- Subscribes: /fused_scan, /amr_pose
4. teleop_interface
- Purpose: Interfaces with the Particle Photon board for cloud commands
- Node: teleop_bridge_node
- Publishes: /cmd_vel (geometry_msgs/Twist)
- Features: Particle Cloud REST integration, emergency stop command, safe-mode fallback
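As one way teleop_bridge_node could "listen" to Particle Cloud, the sketch below subscribes to the Particle event stream (the documented /v1/devices/events SSE endpoint) and maps command strings to Twist messages. It assumes the firmware also publishes each drive command with Particle.publish(); the speed values are placeholders.

```python
import json
import requests
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

ACCESS_TOKEN = "b648206178aff3a71dcdd8e3f32db6b99dac6d98"
# Server-Sent Events stream of events published by your devices.
STREAM_URL = f"https://api.particle.io/v1/devices/events?access_token={ACCESS_TOKEN}"

# Assumed mapping from command strings to (linear.x, angular.z).
COMMANDS = {
    "forward": (0.2, 0.0),
    "backward": (-0.2, 0.0),
    "left": (0.0, 0.5),
    "right": (0.0, -0.5),
    "stop": (0.0, 0.0),
}

class TeleopBridgeNode(Node):
    """Bridges Particle Cloud command events to /cmd_vel."""

    def __init__(self):
        super().__init__("teleop_bridge_node")
        self.publisher = self.create_publisher(Twist, "/cmd_vel", 10)

    def run(self):
        # Each SSE "data:" line carries JSON whose "data" field holds the
        # string the Photon published (e.g., "forward").
        with requests.get(STREAM_URL, stream=True) as stream:
            for raw in stream.iter_lines():
                line = raw.decode()
                if not line.startswith("data:"):
                    continue
                command = json.loads(line[len("data:"):]).get("data", "")
                if command in COMMANDS:
                    twist = Twist()
                    twist.linear.x, twist.angular.z = COMMANDS[command]
                    self.publisher.publish(twist)

def main():
    rclpy.init()
    TeleopBridgeNode().run()

if __name__ == "__main__":
    main()
```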
System Workflow
1. The Arduino sends the distance + servo angle via serial.
2. The serial_comms node reads it and publishes to /sensor_data.
3. sensor_fusion_node combines this with odometry/IMU data and publishes a pseudo-scan on /fused_scan.
4. mapping_visualizer displays this in RViz2.
5. teleop_interface listens to Particle Cloud and issues /cmd_vel commands to control robot motion.
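To tie the workflow together, the four nodes could be started from a single ROS 2 Python launch file. This is only a sketch: the package and executable names mirror the list above and assume each package installs its node under that name.

```python
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Assumed executable names matching the packages described above.
    return LaunchDescription([
        Node(package="serial_comms", executable="serial_reader_node"),
        Node(package="sensor_fusion_node", executable="fusion_node"),
        Node(package="mapping_visualizer", executable="rviz_mapper"),
        Node(package="teleop_interface", executable="teleop_bridge_node"),
    ])
```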
🔵Product Video