I built this self-balancing robot as a way to continue my journey into robotics. During high school, I was a member of my school's FIRST Robotics team, where I served as a builder and constructed and maintained all of the mechanical components of our team's robot. Although I worked alongside our team's CAD designer and programmer, I never got to work directly with the robot's hardware, software, or CAD.
My next experience with robotics came during my freshman year of undergrad at Rice University, where I joined Rice Robotics' Battlebots club to gain experience building a robot on my own. The club gave me full control of my robot's design, and I CADed the frame, weapon, and wheels myself. The electrical components, however, were predetermined: I was told exactly what to use and how to wire everything up. Furthermore, there was no software involved in this robot, as the speed controllers handled all of the logic for us.
All of this led me to want to build a robot entirely on my own. I wanted an intermediate-level robotics project that forced me to consider mechanical, electrical, and software components. After looking through the many different projects available online, I decided that a self-balancing robot would be the perfect project for me.
Project Overview
This two-wheeled robot operates on the inverted pendulum principle, which essentially means that the robot is naturally unstable and wants to fall over. However, the two NEMA 17 stepper motors attached to the wheels move in such a way that the robot's position self-corrects whenever it begins to tilt. A PocketBeagle microcontroller runs a PID (Proportional-Integral-Derivative) algorithm that tells the motors how to spin based on the speed and direction of the robot's tilt.
The speed and direction of the robot's tilt are measured with a BNO055, a 9-axis Inertial Measurement Unit (IMU) that combines an accelerometer, gyroscope, and magnetometer with an onboard sensor-fusion processor. It outputs absolute orientation data (Euler angles or quaternions) directly, eliminating the need for complex fusion algorithms on the host processor. This makes it ideal for applications such as self-balancing robots, robot navigation, and motion tracking.
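To give a concrete sense of how the IMU's orientation output feeds the controller, here is a small helper (an illustrative sketch, not code from this project) that turns a pitch reading in degrees into a signed tilt error for the balance loop:

```python
def tilt_error(pitch_deg, setpoint_deg=0.0):
    """Signed tilt error in degrees, wrapped to [-180, 180).

    pitch_deg: pitch angle as reported by the IMU (degrees).
    setpoint_deg: desired upright angle (0 degrees for vertical).
    """
    error = setpoint_deg - pitch_deg
    # Wrap so that, e.g., a reading of 350 degrees is treated as -10 degrees
    return (error + 180.0) % 360.0 - 180.0
```

The wrapping matters because the BNO055's Euler output can roll over at 0/360 degrees, and the controller should always see the shortest-path error.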
Assembly
Every non-electrical component can be 3D printed on a standard Prusa printer. All parts are printed in PLA except for the tires, which are printed in TPU. Once all the components have been printed, assembly is fairly straightforward:
1). Soldering
The A4988 stepper motor drivers and the BNO055 IMU may require you to solder the header pins yourself. All that is needed is a basic soldering kit and a breadboard to help align the pins.
2). PocketBeagle
A PocketBeagle was used as the robot's microcontroller. AWS was used to program a flashed microSD card in Python. Below is an image of the PocketBeagle's pinout, which will be important for wiring up the different components of this project.
3). A4988 Stepper Drivers and NEMA 17 Motors
Identify the two motor coils:
- Connect one coil to 1A and 1B
- Connect the other coil to 2A and 2B
The A4988 requires a separate motor power supply.
- VMOT → External motor supply positive (typically 12V)
- GND (VMOT side) → Motor supply ground
Place a 100 µF electrolytic capacitor between VMOT and GND close to the driver to protect against voltage spikes.
The PocketBeagle uses 3.3 V logic, which is compatible with the A4988.
- VDD → PocketBeagle 3.3 V
- GND → PocketBeagle GND
Use any available GPIO pins on the PocketBeagle.
- STEP → PocketBeagle GPIO
- DIR → PocketBeagle GPIO
Each pulse on the STEP pin advances the motor one step.
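Since each STEP pulse advances the motor by one (micro)step, the pulse rate determines the wheel speed. Here is a small sketch of that conversion, assuming a standard 1.8°/step NEMA 17 and 1/16 microstepping (the actual microstep mode depends on how the A4988's MS1-MS3 pins are wired):

```python
FULL_STEPS_PER_REV = 200   # a standard 1.8 deg/step NEMA 17
MICROSTEPS = 16            # assumed; set by the A4988's MS1-MS3 pins

def step_interval(rev_per_sec):
    """Seconds between STEP pulses for a given wheel speed.

    Returns None when the requested speed is zero (the motor simply holds).
    """
    if rev_per_sec == 0:
        return None
    steps_per_sec = abs(rev_per_sec) * FULL_STEPS_PER_REV * MICROSTEPS
    return 1.0 / steps_per_sec
```

At one revolution per second this works out to 3200 pulses per second, i.e. a pulse every 312.5 µs; the sign of the requested speed would be sent to the DIR pin separately.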
Before running the motor, adjust the current limit on the A4988. This prevents overheating and missed steps. Check your driver's datasheet and follow an online tutorial to set an appropriate current limit of 1.5 A.
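For reference, the A4988 datasheet relates the current limit to the reference voltage measured at the trimpot by I_max = Vref / (8 × R_sense). A quick helper for computing the target Vref (the 0.068 Ω sense-resistor value below is an assumption; boards vary, so check the resistors on your own driver):

```python
def a4988_vref(current_limit_a, sense_resistor_ohm=0.068):
    """Target Vref for an A4988 current limit: Vref = 8 * Rs * I_limit.

    The 0.068 ohm default is one common sense-resistor value; verify the
    resistors on your own board before trusting the result.
    """
    return 8.0 * sense_resistor_ohm * current_limit_a
```

With 0.068 Ω sense resistors, a 1.5 A limit corresponds to a Vref of about 0.82 V.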
Repeat these steps for the second motor.
4). IMU BNO055
The BNO055 IMU is how the robot knows the speed and direction of its tilt. The IMU sends this information to the PocketBeagle, allowing the Beagle to control the speed and direction at which the motors spin according to the PID algorithm.
The BNO055 communicates with the PocketBeagle using I²C and operates at 3.3 V logic.
Connect the following pins:
- VIN / 3.3V → PocketBeagle 3.3 V
- GND → PocketBeagle GND
The BNO055 uses I²C for data communication:
- SDA → PocketBeagle I²C SDA
- SCL → PocketBeagle I²C SCL
Ensure the PocketBeagle and BNO055 share a common ground. Once wired, enable the PocketBeagle's I²C bus in software and verify communication before reading orientation data.
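One way to do that check from the PocketBeagle's command line (assuming the sensor is wired to I²C1, which appears on pins P2.09/P2.11 and as bus 1 in Linux; the BNO055's default address is 0x28):

```shell
# Route the PocketBeagle header pins to the I2C1 peripheral
config-pin P2_09 i2c   # SCL
config-pin P2_11 i2c   # SDA

# Scan bus 1 -- the BNO055 should show up at its default address, 0x28
i2cdetect -r -y 1
```

If 0x28 does not appear in the scan, re-check the wiring and the shared ground before moving on to software.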
Software
As previously mentioned, the robot relies on a PID (Proportional-Integral-Derivative) controller, which keeps the robot upright by adjusting motor speed and direction based on tilt-angle feedback from the BNO055 IMU.
Define the control variables:
- Setpoint → Desired tilt angle (typically 0°, upright)
- Measured Value → Current tilt angle from the IMU
- Error → Setpoint − Measured Value
The PID controller computes a correction signal using three terms:
- Proportional (P) → Reacts to the current error: a large tilt produces a large correction
- Integral (I) → Reacts to accumulated past error: corrects long-term drift and bias
- Derivative (D) → Reacts to the rate of change of the error: dampens oscillations and improves stability
The control output is calculated as:
Output = Kp·Error + Ki·∫Error dt + Kd·(dError/dt)
Properly tuned PID control allows the robot to continuously correct its balance and remain upright during motion.
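As a sketch of what that control law looks like in code (the class and variable names here are illustrative, not the exact implementation from this project's repository):

```python
class PID:
    """Minimal PID controller: Output = Kp*e + Ki*integral(e) + Kd*(de/dt)."""

    def __init__(self, kp, ki, kd, setpoint=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint      # desired tilt angle (0 = upright)
        self._integral = 0.0
        self._prev_error = None

    def update(self, measured, dt):
        """Return the control output for one time step of length dt seconds."""
        error = self.setpoint - measured
        self._integral += error * dt
        # No derivative term on the very first sample (no previous error yet)
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative
```

In a balancing loop, the sign of the output would set the motors' DIR pins and its magnitude would map to the STEP pulse rate.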
To access the actual Python file and learn more about the code, please refer to the GitHub repository linked below:
Demo Video
Future Implementations
As you may have noticed, I did not discuss either the LCD display or the ultrasonic sensor. Unfortunately, I was unable to implement them in time. They would have allowed the robot to enter an autonomous mode in which it passively wanders around, using the ultrasonic sensor to avoid bumping into obstacles. Furthermore, the LCD display could show different facial expressions based on the robot's current state, such as a smile when it is upright and a frown when it has been knocked over. These are all features that could certainly be added in the future to bring an additional layer of complexity to the overall project.