The final project for UIUC's SE423 Mechatronics course is to program a robot to travel from a start point to 5 different waypoints (illustrated in Figure 1), collecting orange and blue golf balls along the way and delivering them to their respective goal hoppers. Furthermore, 2' × 2' obstacles are placed randomly on the course (subject to various location constraints), which the robot has to avoid!
Following the premise described above, the final class contest involves roughly 10 robots, each with identical hardware but different obstacle avoidance systems, ball-retrieval strategies, and more. The robots are scored on their overall time to complete the course according to the criteria below:
- A baseline score will be given to robots that travel to all 5 waypoints and collect at least 2 golf balls.
- Modifier 1: 20 second time deduction for each additional golf ball retrieved.
- Modifier 2: 20 second time deduction for each (X,Y) coordinate of a golf ball that is properly marked on the LabVIEW map (Figure 2).
- Modifier 3: 80 second time deduction if the robot does not make contact with any wall or obstacle.
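The scoring arithmetic above can be sketched as a small helper. This is purely illustrative: the function name and signature are our own, the deductions are the three modifiers, and "extra balls" follows our reading of Modifier 1 as each ball beyond the baseline 2. Checking baseline eligibility (all 5 waypoints, at least 2 balls) is assumed to happen elsewhere.

```python
def final_score(time_s, extra_balls, marked_balls, no_contact):
    """Lower is better: start from the raw course time and apply deductions.

    time_s       -- time to complete the course, in seconds
    extra_balls  -- golf balls retrieved beyond the baseline 2 (Modifier 1)
    marked_balls -- balls whose (X,Y) was marked on the map (Modifier 2)
    no_contact   -- True if the robot never touched a wall or obstacle
    """
    score = time_s
    score -= 20 * extra_balls   # Modifier 1: 20 s per additional ball
    score -= 20 * marked_balls  # Modifier 2: 20 s per properly marked ball
    if no_contact:
        score -= 80             # Modifier 3: 80 s for a clean run
    return score
```

For example, a 300-second run with 2 extra balls, 4 marked coordinates, and no contact would score 300 − 40 − 80 − 80 = 100.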
Obstacle Avoidance:
To traverse the course as efficiently as possible without colliding with a wall or obstacle, our group decided to implement an A* pathfinding algorithm that works in tandem with LIDAR data. The system works as follows:
- The robot begins by trying to travel to a waypoint.
- The LIDAR detects an obstacle in front of the robot, from which the robot calculates an approximate location of the obstacle in the world frame. This approximation uses the robot's position (from the OptiTrack camera system) and the robot's bearing. For a detection to count as an obstacle, it must be seen a certain number of times, or tallies (2).
- Once an obstacle is confirmed, our system looks for a location match in a hardcoded list of possible obstacle locations, which makes the system more efficient and reliable. Once a match is found, the obstacle is marked on the revisable A* map.
- The updated A* map is run through the A* algorithm on the Raspberry Pi, producing a new path: a series of intermediate waypoints for the robot to follow.
- As the robot follows the new A* path, if it detects more obstacles, this entire process repeats.
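The detection-and-replan loop above can be sketched in Python. This is a minimal stand-in, not our actual Pi code: the candidate obstacle list, snap radius, and grid are made-up example values, and the A* below is a plain 4-connected grid search with a Manhattan heuristic.

```python
import heapq
import math

# Hypothetical values for illustration; the real candidate list and
# tally threshold come from the course layout and our tuning.
CANDIDATE_OBSTACLES = [(2, 2), (4, 6), (6, 3)]  # known possible 2'x2' cells
SNAP_RADIUS = 1.0                               # ft, max distance to a candidate

def obstacle_world_position(robot_x, robot_y, bearing_rad, lidar_range_ft):
    """Project a LIDAR return into the world frame from pose + bearing."""
    return (robot_x + lidar_range_ft * math.cos(bearing_rad),
            robot_y + lidar_range_ft * math.sin(bearing_rad))

def snap_to_candidate(point):
    """Match a raw detection to the nearest hardcoded obstacle location."""
    best = min(CANDIDATE_OBSTACLES, key=lambda c: math.dist(point, c))
    return best if math.dist(point, best) <= SNAP_RADIUS else None

def a_star(grid, start, goal):
    """Plain A* on a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if no path."""
    rows, cols = len(grid), len(grid[0])
    def h(c):  # Manhattan-distance heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), 0, start, None)]
    came_from, g_score = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:          # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:               # walk parents back to reconstruct
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < g_score.get(nxt, float("inf")):
                    g_score[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None
```

Each confirmed obstacle would set its cell in `grid` to 1 before `a_star` is rerun, which is what makes the map "revisable."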
While the above system is running, the robot is also able to Right Wall Follow, which serves as an additional obstacle/wall avoidance mechanism (the name is self-explanatory). Whether the robot follows the A* path or uses Right Wall Following is controlled by a few states of a state machine: if the LIDAR detects that the robot has been within 0.75 feet of an obstacle for 310 ms, the robot switches to Right Wall Following.
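The switch between the two behaviors can be sketched as a tiny debounced state machine. The 0.75 ft range and 310 ms dwell are the values above; the 10 ms tick period and state names are assumptions for illustration, and the condition for returning to the A* path is omitted.

```python
FOLLOW_ASTAR, WALL_FOLLOW = 0, 1
NEAR_THRESHOLD_FT = 0.75
DWELL_MS = 310
TICK_MS = 10              # assumed control-loop period

class AvoidanceArbiter:
    def __init__(self):
        self.state = FOLLOW_ASTAR
        self.near_ms = 0  # how long an obstacle has stayed close

    def update(self, min_lidar_range_ft):
        """Called every control tick with the closest LIDAR return."""
        if self.state == FOLLOW_ASTAR:
            if min_lidar_range_ft < NEAR_THRESHOLD_FT:
                self.near_ms += TICK_MS
                if self.near_ms >= DWELL_MS:
                    self.state = WALL_FOLLOW  # hug the right wall instead
            else:
                self.near_ms = 0  # the close reading must be continuous
        return self.state
```

The dwell timer is what keeps a single noisy LIDAR return from needlessly yanking the robot off its planned path.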
Golf Ball Detection and Retrieval
The robot also needs to collect orange and blue golf balls along the way, deliver them to their respective hoppers, and mark their locations on the LabVIEW map. To do this, the robot uses the OpenMV camera system and a state machine.
The OpenMV camera detects an orange or blue golf ball using hardcoded color threshold values that we determined. Once the camera sees an object that lies within the thresholds, an OpenMV script runs a blob search algorithm, which groups the matching pixels into a blob and computes the object's centroid. The blob areas and centroids are sent to the robot, which then moves through several different states.
The state machine has 4 states for the orange balls and 4 states for the blue balls. Once the robot detects a ball whose blob area is greater than 100, several states are executed.
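A blob search of the kind described can be sketched in plain Python. This is a generic stand-in for the camera-side routine, not the script we ran on the OpenMV: it takes an already-thresholded 0/1 mask, flood-fills 4-connected components, and reports each blob's area and centroid, largest first.

```python
from collections import deque

def find_blobs(mask):
    """Group adjacent in-threshold pixels into blobs and return a list of
    (area, centroid_row, centroid_col), largest blob first.
    `mask` is a 2-D list of 0/1 after color thresholding."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # BFS flood fill over the 4-connected component
                q = deque([(r, c)])
                seen[r][c] = True
                cells = []
                while q:
                    cr, cc = q.popleft()
                    cells.append((cr, cc))
                    for nr, nc in ((cr+1, cc), (cr-1, cc), (cr, cc+1), (cr, cc-1)):
                        if 0 <= nr < rows and 0 <= nc < cols \
                                and mask[nr][nc] and not seen[nr][nc]:
                            seen[nr][nc] = True
                            q.append((nr, nc))
                area = len(cells)
                row_c = sum(p[0] for p in cells) / area  # centroid row
                col_c = sum(p[1] for p in cells) / area  # centroid col
                blobs.append((area, row_c, col_c))
    return sorted(blobs, reverse=True)
```

Sorting by area first is what lets the robot chase the nearest (largest-looking) ball and lets small noise blobs fall below the area-100 cutoff.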
- State 20/30: The robot stops, deviates from its current path to drive toward the detected ball's centroid, and sets the tongue to the corresponding color position. Once the golf ball's centroid falls below a set pixel row, indicating that the robot is close to the ball, the state machine moves to the next state.
- State 22/32: The robot waits 100ms and opens the collector gate. After 1000ms, the robot moves on to the next state.
- State 24/34: The robot moves forward for 500ms so that the ball enters the collection system. Then the collection gate closes and a flag is raised and sent to LabVIEW so that the appropriately colored ball is marked on the LabVIEW map. The state machine then returns to State 1, where the robot continues to the next waypoint.
- State 28/38: These states occur when the robot is within 0.5 feet of a ball hopper: the robot moves the tongue to the corresponding color position, opens the collector gate, and maneuvers so the balls roll into the hopper.
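The orange-ball collection sequence (states 1 → 20 → 22 → 24 → 1) can be sketched as a small tick-driven state machine. The state numbers, area cutoff, and 100/1000/500 ms timings follow the list above; the pixel-row threshold and the 10 ms tick period are illustrative assumptions, and the actual motor/gate/tongue commands are reduced to comments.

```python
AREA_MIN = 100        # blob must exceed this area to count as a ball
BALL_CLOSE_ROW = 200  # assumed image row meaning "ball is near"
TICK_MS = 10          # assumed control-loop period

class BallCollector:
    def __init__(self):
        self.state = 1        # State 1: drive toward the next waypoint
        self.timer_ms = 0

    def step(self, blob_area=0, blob_row=0):
        """Advance one tick given the latest camera blob data."""
        if self.state == 1 and blob_area > AREA_MIN:
            self.state = 20   # deviate toward the ball, set tongue position
        elif self.state == 20 and blob_row > BALL_CLOSE_ROW:
            self.state, self.timer_ms = 22, 0
        elif self.state == 22:
            # wait 100 ms, open the collector gate, 1000 ms total in state
            self.timer_ms += TICK_MS
            if self.timer_ms >= 1000:
                self.state, self.timer_ms = 24, 0
        elif self.state == 24:
            # drive forward 500 ms, then close the gate and raise the
            # flag that marks the ball on the map
            self.timer_ms += TICK_MS
            if self.timer_ms >= 500:
                self.state = 1  # resume traveling to the next waypoint
        return self.state
```

The blue-ball sequence (states 30/32/34) would be identical apart from the tongue position and the flag value sent to LabVIEW.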
Talking Robot
As an additional feature, and just to have a little bit of fun, we gave the robot the ability to talk. When a specific action is completed, the robot says a corresponding phrase; for example, when it collects an orange ball, it says "orange ball collected".
To accomplish this, we created a speech flag variable in C that is set to different integers for different phrases; for example, the flag is set to 2 in Case 30, which corresponds to the robot saying "blue ball collected". The flag is sent to LabVIEW through LVCOMApp, a Raspberry Pi script that handles sending data from the robot to LabVIEW, such as the robot's position and various flag values. In LVCOMApp, we implemented several if-statements with the speech flag as the conditional, so that when an if-statement is true, a specific espeak command is sent to the Raspberry Pi terminal. espeak is a command-line speech synthesizer that can be run from the terminal.
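The flag-to-phrase dispatch can be sketched as follows. Only flag 2 ("blue ball collected") is stated above; the other flag values and phrases are invented examples, and the `run` parameter exists only so the command launch can be stubbed out.

```python
import subprocess

# Hypothetical mapping of speech-flag values to phrases; only flag 2
# is confirmed, the rest are illustrative.
PHRASES = {
    1: "orange ball collected",
    2: "blue ball collected",
    3: "waypoint reached",
}

def speak(flag, run=subprocess.run):
    """Turn a speech flag received from the robot into an espeak
    invocation, standing in for the if-statement chain in LVCOMApp."""
    phrase = PHRASES.get(flag)
    if phrase is None:
        return None            # unknown flag: stay silent
    cmd = ["espeak", phrase]   # e.g. `espeak "blue ball collected"`
    run(cmd)                   # launch the synthesizer on the Pi
    return cmd
```

A dictionary lookup like this replaces the chain of if-statements with a single table, which makes adding a new phrase a one-line change.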
The Robot in Action
A video of the robot on the course can be found at this link: https://photos.app.goo.gl/k4vNLtGkioUKCEew8
Our Group
Our group, from right to left: Mengxuan (Max) Xie, Conrad Ku, Na-Teng (Nathan) Hung and Willard Sullivan.