This project's original goal was to build a remote-control car from Lego Technic. While exploring configurations I discovered the Raspberry Pi BuildHAT, a motor interface between Lego Powered Up motors and a Raspberry Pi. Version 1 of the Autonomous Auto used an HC-SR04 ultrasonic distance sensor, which did not provide reliable distance data: the sound waves were absorbed by various materials, and the resulting errors caused many crashes. The lack of vision also made it hard to diagnose logic issues.
For Version 2, my first camera was the Raspberry Pi AI Camera, which integrated easily with Raspberry Pi OS. I wanted to add ROS2 as the controller, which required switching to Canonical Ubuntu 24. The Pi AI Camera has no Ubuntu driver, so I switched to a USB camera, the Arducam B0433 (12 MP). I added the VL53L5CX from Pololu as the distance sensor, which provides object detection across sixteen zones in its field of view.
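The sixteen VL53L5CX zones arrive as a grid of per-zone distances. A minimal sketch of how the obstacle-avoidance logic might pick the nearest target from one frame of readings, assuming a row-major 4x4 layout and millimetre units (the function name and invalid-reading convention are mine, not from the sensor API):

```python
def nearest_obstacle(zones, invalid=(0, None)):
    """Return (distance_mm, row, col) for the closest valid zone in a
    row-major 4x4 grid of VL53L5CX readings, or None if no zone holds
    a valid target. Zeros/None are treated as 'no target' (assumption)."""
    best = None
    for i, d in enumerate(zones):
        if d in invalid:
            continue
        if best is None or d < best[0]:
            best = (d, i // 4, i % 4)  # flat index -> grid coordinates
    return best
```

With a frame where zone 5 reads 820 mm and the rest read 1500 mm or invalid, this returns `(820, 1, 1)`, i.e. the obstacle is in the second row, second column of the field of view.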
Integrating these devices with ROS2 Jazzy was aided by ChatGPT, which gave very good technical advice on the C++ and Python code used for the ROS2 nodes. The target cycle time for responsive control is 50 Hz, and Ubuntu must be configured to sustain that rate.
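In a ROS2 node the 50 Hz cycle would normally come from a timer (`create_timer(0.02, callback)`). The scheduling idea behind it can be sketched in plain Python; this is an illustration of fixed-rate pacing, not the project's actual node code, and `run_cycles` is a name I made up:

```python
import time

CYCLE_HZ = 50
PERIOD = 1.0 / CYCLE_HZ  # 20 ms per control cycle

def run_cycles(step, n):
    """Run `step` n times on a fixed 50 Hz schedule. Deadlines are
    computed from the start time, so timing error does not accumulate
    even if an individual step runs long."""
    start = time.monotonic()
    for i in range(n):
        step(i)
        deadline = start + (i + 1) * PERIOD
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
    return time.monotonic() - start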
The video feed uses Flask and OpenCV to generate the annotated image. The Ubuntu 24 image is headless, so I view the feed in a browser.
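Browser-viewable Flask video feeds typically stream MJPEG: the server sends one JPEG after another in a `multipart/x-mixed-replace` response. A minimal sketch of the multipart framing, assuming each frame has already been JPEG-encoded (in a real node, via `cv2.imencode('.jpg', img)[1].tobytes()`); the function and boundary names are illustrative:

```python
BOUNDARY = b"frame"

def mjpeg_stream(jpeg_frames):
    """Yield multipart/x-mixed-replace chunks for an MJPEG stream.
    Each element of jpeg_frames is one JPEG image as raw bytes."""
    for jpeg in jpeg_frames:
        yield (b"--" + BOUNDARY + b"\r\n"
               b"Content-Type: image/jpeg\r\n"
               b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
               + jpeg + b"\r\n")
```

A Flask route would wrap this generator in a `Response` with mimetype `multipart/x-mixed-replace; boundary=frame`, which is what lets the browser repaint each arriving frame in place.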
I had an issue with cooling the Raspberry Pi; it was resolved by leaving more space between the cooling fan and the BuildHAT.
The BuildHAT integration has been completed. I found a forced speed governor of 50%. Setting the "plimit" switch to 1 removed the limitation in the BuildHAT API, and the motors now run at their specified maximum.
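The effect of the power limit can be illustrated with a small helper: the governor scales every commanded speed by the limit factor, so at the default 0.5 a full-speed request only reaches half of the rated maximum. This is purely illustrative (the real limit is applied inside the BuildHAT firmware/API, and `commanded_speed` is my own name):

```python
def commanded_speed(requested_pct, plimit=1.0):
    """Illustrate the BuildHAT power limit: a requested speed (percent
    of rated maximum, clamped to [-100, 100]) scaled by plimit.
    plimit=0.5 reproduces the default 50% governor; plimit=1.0 allows
    full rated speed."""
    requested_pct = max(-100.0, min(100.0, requested_pct))
    return requested_pct * plimit
```

So a request for 100% under the default limit yields an effective 50%, matching the behaviour I observed before changing the setting.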
I have added monitoring of the CPU and the motors as seen in the screen capture below.












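For the CPU side of the monitoring, the Pi exposes the SoC temperature through sysfs as an integer in millidegrees Celsius. A minimal sketch of reading and converting it, assuming the standard `thermal_zone0` path (the function names are mine):

```python
def millideg_to_celsius(raw):
    """Convert the millidegree string from the Pi's thermal sysfs node
    (e.g. '48312\n') to degrees Celsius."""
    return int(raw.strip()) / 1000.0

def read_cpu_temp(path="/sys/class/thermal/thermal_zone0/temp"):
    """Read the SoC temperature in degrees C from sysfs (Linux/Pi only)."""
    with open(path) as f:
        return millideg_to_celsius(f.read())
```

Polling this once per second is plenty for a dashboard and cheap enough not to disturb the 50 Hz control loop.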