This project used a TI F28379D LaunchPad and an Orange Pi connected over UART. The F28379D controlled the wheel motors and read the two IR sensors. The IR sensors were wired to the green board by soldering a connection from the 5V supply to each sensor, and each sensor's output was routed to one of the F28379D's ADC pins. Two servo motors were also connected to one of the PWM channels. The robot has two functions: following a right wall and firing upon a pink target. Several components were also 3D printed.
The IR sensors send their readings to the F28379D as analog voltages. To sample them, the F28379D's hardware interrupts were used to perform analog-to-digital conversion (ADC): the ADC peripheral generates an interrupt when a conversion completes, and the sample is stored in the peripheral's result registers. The ADCD peripheral handled the two IR sensors on channels ADCIND0 and ADCIND1. The raw results are converted to voltages from 0 to 3 V. A shorter distance between the sensor and an object produces a higher voltage (a near object reads close to 3 V), so the reading is inverted for use in the right-wall-following code. In basic terms, the wall-following code has the robot maintain a specified distance from the right wall, and if it senses a wall ahead it turns left to avoid it.
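To make the ADC flow concrete, below is a minimal sketch of what the end-of-conversion interrupt could look like using TI's C2000Ware register-frame headers. The sensor-to-channel assignments, variable names, and the assumption that SOC0/SOC1 sample ADCIND0/ADCIND1 are illustrative, not the project's actual code.

```c
#include "F28x_Project.h"

volatile float irRightVolts = 0.0f;   // side-facing sensor (assumed mapping)
volatile float irFrontVolts = 0.0f;   // forward-facing sensor (assumed mapping)

__interrupt void ADCD1_ISR(void)
{
    // 12-bit results (0..4095) scaled to the 0-3 V reference
    irRightVolts = AdcdResultRegs.ADCRESULT0 * (3.0f / 4095.0f);
    irFrontVolts = AdcdResultRegs.ADCRESULT1 * (3.0f / 4095.0f);

    // A near object gives a high voltage, so invert to get a
    // distance-like value (larger means farther) for wall following:
    // float rightDist = 3.0f - irRightVolts;

    AdcdRegs.ADCINTFLGCLR.bit.ADCINT1 = 1;   // clear the ADC interrupt flag
    PieCtrlRegs.PIEACK.all = PIEACK_GROUP1;  // acknowledge PIE group 1
}
```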
The servos and the wheel motors were controlled with the F28379D's pulse-width modulation (PWM) peripherals: EPWM2 drove the wheels and EPWM8 drove the servos. It was crucial that this code ran at a fixed rate, or the wheels and servos would not function properly, so it was placed inside one of the periodic interrupts. The servos were positioned by setting the PWM duty cycle corresponding to the desired angle between -90 and +90 degrees. The wheel PWM channels were driven by a control algorithm that used the rotary encoders built into the motors. In short, the algorithm saves the past angle each wheel has spun, compares it with the current angle to compute the velocity at which the wheel is spinning, and feeds these measurements, along with a reference velocity and turn rate, into a coupled PI controller. The IR sensor data then sets both the turn rate and the reference velocity the robot should have.
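The sketch below illustrates the two control ideas in this paragraph: mapping a servo angle to a duty cycle, and a coupled PI loop that mixes a common velocity term with a differential turn term. All gains, names, and timing values (Kp, Ki, DT, the 1-2 ms servo pulse) are hypothetical placeholders, not the project's tuned values.

```c
#define DT 0.001f   /* control period in seconds (1 kHz ISR assumed) */

/* Map a servo angle in [-90, +90] degrees to a PWM duty fraction,
 * assuming a 20 ms period and a typical 1-2 ms hobby-servo pulse. */
float servo_duty(float angleDeg)
{
    float pulseMs = 1.5f + (angleDeg / 90.0f) * 0.5f;  // 1.0-2.0 ms
    return pulseMs / 20.0f;                            // duty fraction
}

static float eVelInt  = 0.0f;   // integral of forward-velocity error
static float eTurnInt = 0.0f;   // integral of turn-rate error

void wheel_control(float vLeft, float vRight,     // encoder-derived speeds
                   float vRef,  float turnRef,    // commands from IR logic
                   float *uLeft, float *uRight)   // PWM duty commands out
{
    const float Kp = 3.0f, Ki = 25.0f;            // velocity PI gains
    const float Kturn = 3.0f;                     // coupling (turn) gain

    float eVel  = vRef - 0.5f * (vLeft + vRight); // common-mode error
    float eTurn = turnRef - (vRight - vLeft);     // differential error

    eVelInt  += eVel  * DT;
    eTurnInt += eTurn * DT;

    // Couple the loops: both wheels share the velocity term, and the
    // turn term is added to one wheel and subtracted from the other.
    *uLeft  = Kp * eVel + Ki * eVelInt - Kturn * (eTurn + eTurnInt);
    *uRight = Kp * eVel + Ki * eVelInt + Kturn * (eTurn + eTurnInt);
}
```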
For the sharpshooter portion, we put a webcam on the front of the robot and plugged it into the Orange Pi. We used the blob_search library for the color-targeting portion of the project. When the camera sees a sufficiently large pink target, the Orange Pi processes the frames and sends the coordinates of the centroid of the pink area to the LaunchPad. To make it easier to integrate the separate parts of the project, we used case structures in our code. The main case runs the wall-following logic and is what the robot does when its camera sees no target. Once a target is seen, the code moves into the other cases, in which the robot rotates until it is in line with the target. Then, depending on the height of the centroid of the pink target and its area, one servo tilts the cannon on the Z-axis and the other servo pulls the trigger to fire. The two servos had to be tuned so that the robot could reliably shoot at the height of the target and the cannon would reliably go off. Lastly, the servos reset and the case structure returns to the wall-following logic. If the robot stops to aim when it sees a target, it will also return to wall following after a specified amount of time if the target is not seen again.
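A hedged sketch of this case structure, written as a C switch inside the control loop, is shown below. The state names, thresholds, and helper functions (wall_follow_step(), tilt_cannon(), and so on) are hypothetical stand-ins for the project's actual code and are assumed to be defined elsewhere.

```c
typedef enum { WALL_FOLLOW, ROTATE_TO_TARGET, AIM_AND_FIRE, RESET_SERVOS } Mode;

/* Assumed helpers, defined elsewhere in the firmware. */
void wall_follow_step(void);
void rotate_toward(float x);
int  centered(float x);
void tilt_cannon(float y, float area);
void pull_trigger(void);
void reset_servos(void);

static Mode mode = WALL_FOLLOW;
static int  lostTargetTicks = 0;
#define LOST_TARGET_LIMIT 2000   /* give up after ~2 s at a 1 kHz rate */

void robot_step(int targetSeen, float centroidX, float centroidY, float area)
{
    switch (mode) {
    case WALL_FOLLOW:
        wall_follow_step();                       // default behavior
        if (targetSeen) { mode = ROTATE_TO_TARGET; lostTargetTicks = 0; }
        break;
    case ROTATE_TO_TARGET:
        rotate_toward(centroidX);                 // spin until lined up
        if (centered(centroidX))
            mode = AIM_AND_FIRE;
        else if (!targetSeen && ++lostTargetTicks > LOST_TARGET_LIMIT)
            mode = WALL_FOLLOW;                   // timeout: target lost
        break;
    case AIM_AND_FIRE:
        tilt_cannon(centroidY, area);             // tilt from height and area
        pull_trigger();                           // second servo fires
        mode = RESET_SERVOS;
        break;
    case RESET_SERVOS:
        reset_servos();                           // back to neutral positions
        mode = WALL_FOLLOW;
        break;
    }
}
```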