Human-robot interaction (HRI) is shifting toward natural-language interfaces powered by large language models (LLMs). LLMs allow users to give intuitive, high-level commands instead of writing complex programs. Our team's objective is to improve user experience by simplifying communication with robots and making HRI accessible to a broader range of users. To that end, our project enables robot control via natural-language commands using the OpenAI ChatGPT API.
There are two primary modes. The first mode is motion planning, where the user provides a starting point, an ending point, and obstacles they want the robot to avoid. The second mode is custom, where the user provides a specific path for the robot to follow, such as a shape. After the user enters a command and the LLM generates a path, the waypoints are parsed and stored in a CSV file. LabVIEW then locates the CSV file by the provided file name and sends the data to the robot's F28379D LaunchPad to execute the desired trajectory. Transmission Control Protocol (TCP) communication supports continuous input of user commands to the LLM, so new paths can be generated and executed without restarting the entire system. Furthermore, the waypoint data is sent to the robot in groups of a flag value plus three points at a time, which supports an arbitrary total number of waypoints.
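As a rough illustration, the grouped waypoint transfer can be sketched in Python. The packet layout here (one flag value followed by three (x, y) points, with the flag indicating whether more groups follow) is an assumption for illustration, not the exact LabVIEW framing:

```python
def group_waypoints(waypoints, group_size=3):
    """Split a waypoint list into (flag, group) packets of `group_size` points.

    flag = 1 means more groups follow; flag = 0 marks the final group.
    A short final group is padded by repeating the last waypoint so every
    packet carries the same number of points (an assumption of this sketch).
    """
    packets = []
    for i in range(0, len(waypoints), group_size):
        group = waypoints[i:i + group_size]
        while len(group) < group_size:      # pad the short final group
            group.append(group[-1])
        flag = 1 if i + group_size < len(waypoints) else 0
        packets.append((flag, group))
    return packets
```

A five-waypoint path would thus be sent as two packets: the first with flag 1 and three points, the second with flag 0, its last point repeated once as padding.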
Once this information is received by the LaunchPad, position control can be executed. Position control is achieved by first applying inverse kinematics to compute the desired steering angle. Next, the error between the desired steering angle and the bearing (Figure 1) is computed. The robot turns right or left until this error falls below approximately 0.5 degrees. Several experiments were conducted to determine this threshold: it must not be so small that the robot overshoots and turns continuously, yet not so large that the robot fails to reach the desired ending point. Once the angle error is within 0.5 degrees, the robot moves toward the (x, y) position of interest while checking its current location every 4 milliseconds. This control logic repeats until the robot has visited all specified waypoints.
Collision detection was implemented as well; if the robot was less than 200 millimeters away from an object or wall, it would stop its motion.
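The turn-then-drive logic with the collision stop can be condensed into a single decision step. The Python sketch below is a simplified illustration, not the firmware running on the F28379D; `desired_angle_deg` stands in for the inverse-kinematics computation, and the function names are our own:

```python
import math

ANGLE_TOLERANCE_DEG = 0.5   # experimentally chosen heading tolerance
STOP_DISTANCE_MM = 200.0    # collision-stop threshold

def heading_error(desired_deg, bearing_deg):
    """Signed error between desired steering angle and current bearing,
    wrapped into (-180, 180] degrees."""
    return (desired_deg - bearing_deg + 180.0) % 360.0 - 180.0

def desired_angle_deg(x, y, tx, ty):
    """Bearing from the current position to the target waypoint (a
    simplified stand-in for the report's inverse-kinematics step)."""
    return math.degrees(math.atan2(ty - y, tx - x))

def control_step(desired_deg, bearing_deg, obstacle_mm):
    """One iteration of the assumed control loop: stop near obstacles,
    otherwise turn until the heading error is within tolerance, then drive."""
    if obstacle_mm < STOP_DISTANCE_MM:
        return "stop"
    err = heading_error(desired_deg, bearing_deg)
    if abs(err) > ANGLE_TOLERANCE_DEG:
        return "turn_left" if err > 0 else "turn_right"
    return "drive"
```

In the real system this step would run on the LaunchPad every 4 milliseconds, advancing to the next waypoint once the current one is reached.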
Video 1 provides additional detail on the sensors used to execute position and collision control: https://drive.google.com/file/d/1DWA-_gmSIKdq5NaS8PT7DvZ0-LxchyYt/view?usp=drive_link.
The complete process is summarized in Figure 2.
Mode I: Motion Planning
An example of a prompt and the trajectory generated by the LLM (Figure 3) is below:
"Generate a 2D motion-planning trajectory. Map size: 3 × 3 meters. Start: (0, 0). Goal: (1.5, 1.5). Obstacles: obstacles = [(1, 1), (0.5, 0.5), ]. Using Greedy Search Path planning algorithm. Must start from positive X direction to the second point. Please generate exactly 6 waypoints (including start and end). Must be the points number I want. The trajectory must avoid all obstacles with ≥0.6 m safety distance. Points should be different. Output ONLY CSV: x, y (no header, no explanation)."
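A prompt like this can be sent programmatically and the reply parsed into waypoints. The sketch below is an assumption of how such a script might look, not our exact code; the model name and output file name are placeholders:

```python
import csv
import io

def parse_waypoints(csv_text):
    """Parse the LLM's headerless 'x, y' CSV reply into (x, y) float tuples."""
    reader = csv.reader(io.StringIO(csv_text.strip()))
    return [(float(x), float(y)) for x, y in reader]

def request_trajectory(prompt, model="gpt-4o"):
    """Send the prompt to the OpenAI API and return parsed waypoints
    (requires an API key in the environment)."""
    from openai import OpenAI
    client = OpenAI()
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return parse_waypoints(reply.choices[0].message.content)

def save_waypoints(waypoints, path="waypoints.csv"):
    """Write waypoints in the headerless 'x, y' format that LabVIEW reads."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(waypoints)
```

Because the prompt demands CSV-only output, `parse_waypoints` can stay simple; a production version would also validate the waypoint count and reject malformed replies.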
Video 2 demonstrates the robot following the above path: https://drive.google.com/file/d/1NM-Du00_HhI8mabA3atKgax7jdmfMSNa/view?usp=sharing.
Mode II: Custom
An example of a prompt and the trajectory generated by the LLM (Figure 4) is below:
"Generate exactly 18 waypoints such that: 1. The first waypoint must be exactly (0.00, 0.00). 2. The remaining waypoints must lie on the circle with center {(0, 0)} and radius {0.5}. 3. The second waypoint must be the point on the circle closest to the origin. 4. All remaining circle points must be evenly spaced along the circle. 5. Points should be different. Format all numbers to exactly two decimals. Output ONLY CSV: x, y (no header)."
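Because the custom path is fully specified, the LLM's output can be checked against a deterministic reference. The sketch below is our illustration, not the team's code; since the circle is centered at the origin, every circle point is equally close to (0, 0), so the sketch simply starts at angle zero:

```python
import math

def circle_waypoints(cx=0.0, cy=0.0, r=0.5, n=18):
    """First waypoint at the origin, then n-1 evenly spaced points on the
    circle of radius r about (cx, cy), rounded to two decimals."""
    pts = [(0.0, 0.0)]
    for k in range(n - 1):
        theta = 2.0 * math.pi * k / (n - 1)
        pts.append((round(cx + r * math.cos(theta), 2),
                    round(cy + r * math.sin(theta), 2)))
    return pts
```

Comparing the LLM's 18 points against this reference (within rounding tolerance) gives a quick sanity check before the path is sent to the robot.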
Video 3 demonstrates the robot following the above path: https://drive.google.com/file/d/1LdelW8QiVZEgtKQDTXVnD_8KetAHSvSP/view?usp=sharing.