After being properly impressed by the performances of Boston Dynamics' Spot and Unitree's Go2 robotic dogs, I started taking an interest in designing, building and coding one of my own. Since I'd also gotten into 3D printing, I naturally looked for models that could be printed without breaking the bank. Quite quickly I zeroed in on quadruped designs built around SG90/MG90S servos, though I also had a go at MG996R models, which turned out to be quite a bit larger than the smaller MG90S. While trying to figure out how to build my own, I scouted the internet for previous takes by other people. Not surprisingly, there are many, many (some might even say too many) projects of this kind. Trying to duplicate one usually got me stuck: either the 3D printed parts didn't come out like the original's, or the servos and other hardware I used didn't quite match what the project called for. So, after a couple of failed attempts at duplicating a project, I decided to take matters into my own hands.
So, starting off from the OpenCat project and its 3D printed parts, I designed the parts for a 3DOF leg around the MG90S. I had previously tried SG90s, whose nylon gears broke after a short time under load, and really cheap MG90S clones with "half metal" gears that snapped when you least expected it, so I decided to only use "full metal" MG90S servos and bought a bucket load of them (around 50 pieces) so I'd have plenty of spares. Since my aim was a 12 DOF robot, I also had to get some PCA9685 PWM servo driver boards. Initially I was focused on giving the robot some ranging capability using an HC-SR04 ultrasonic sensor, but after a few robot iterations it became obvious that the bot needed wireless connectivity for control and a video feed to be of any actual use. So I moved from the ESP8266, which makes a wonderful platform for WiFi+MQTT projects, to the ESP32-WROOM-CAM microcontroller. I shamelessly repurposed some camera web server code I got off the internet and from other robot kits I'd purchased over time, and adapted it to my own use case.
I eventually ended up with a quad bot controlled by an ESP32-WROOM-CAM "brain", its 12 full metal MG90S servos hooked into the PCA9685 PWM servo driver, an HC-SR04 sonar for ranging and lastly an MPU-6050 IMU for stabilization. That last feature is still a work in progress: I can read the robot's angles, but I'm still working out how to react to changes.
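For the curious, here's roughly how that hardware stack comes together in code. This is a minimal sketch using the Adafruit PWM servo driver library; the I2C pins and pulse limits are illustrative, not necessarily what's in my actual code:

```cpp
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

// PCA9685 on the default I2C address 0x40; the MPU-6050 shares the bus at 0x68.
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver(0x40);

const float SERVO_FREQ = 50.0f;  // analog servos like the MG90S expect ~50 Hz
const int   SERVOMIN   = 102;    // PCA9685 counts for ~0 degrees (tune per servo batch)
const int   SERVOMAX   = 512;    // PCA9685 counts for ~180 degrees

// Map a joint angle in degrees onto the PCA9685's 12-bit pulse range.
void setServoAngle(uint8_t channel, float degrees) {
  uint16_t pulse = SERVOMIN + (uint16_t)((degrees / 180.0f) * (SERVOMAX - SERVOMIN));
  pwm.setPWM(channel, 0, pulse);
}

void setup() {
  Wire.begin(14, 15);  // the ESP32-CAM has few free pins; 14/15 as SDA/SCL is one option
  pwm.begin();
  pwm.setPWMFreq(SERVO_FREQ);
}

void loop() {}
```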
The most difficult part was the Inverse Kinematics: how to get the robot to move. Naturally, I went through many tutorials by other folks who've designed and built their own quad bots but, as I mentioned, my "monkey see, monkey do" approach did not quite work, and I wasn't really able to understand the code I had access to. Even though I understood the trigonometry behind solving the IK, getting past that and applying it to my quad bots eluded me for a while.
After a bit of deliberation I decided to let go of my old approach to controlling the servos, which was in fact a Direct Kinematics approach that worked to some extent, but not really the way I was seeing the IK bots walk. So I took the trigonometry equations, solved them "my way", applied them to the 3D printed 3DOF leg assembly I had, and coded a small program in Processing/ControlP5 to compute the angles of each joint correctly.
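The gist of the trigonometry, for a leg made of a coxa (hip), femur and tibia segment, is a law-of-cosines solve. Something along these lines, with my link lengths and sign conventions as placeholders (yours will differ):

```cpp
#include <math.h>

// Link lengths in mm -- placeholders, adjust for your leg design.
const float L_COXA  = 28.0f;   // hip pivot to femur pivot
const float L_FEMUR = 55.0f;   // femur pivot to tibia pivot
const float L_TIBIA = 70.0f;   // tibia pivot to foot tip

// Solve the three joint angles (radians) for a foot tip target given as
// (x forward, y sideways, z down) in the hip's frame. Returns false when
// the target is out of reach.
bool solveLegIK(float x, float y, float z, float &coxa, float &femur, float &tibia) {
  coxa = atan2f(y, x);                      // hip yaw just aims the leg plane at the target
  float r = sqrtf(x * x + y * y) - L_COXA;  // horizontal reach beyond the coxa link
  float d = sqrtf(r * r + z * z);           // straight line from femur pivot to foot tip
  if (d > L_FEMUR + L_TIBIA || d < fabsf(L_FEMUR - L_TIBIA)) return false;

  // Law of cosines gives the knee; the femur angle is "aim at the foot"
  // plus the interior angle of the femur-tibia-target triangle.
  tibia = acosf((L_FEMUR * L_FEMUR + L_TIBIA * L_TIBIA - d * d) / (2.0f * L_FEMUR * L_TIBIA));
  femur = atan2f(-z, r) + acosf((L_FEMUR * L_FEMUR + d * d - L_TIBIA * L_TIBIA) / (2.0f * L_FEMUR * d));
  return true;
}
```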
With the model working in the simulator, I had to test it on an actual bot, and since I had quite a few quad bots sitting idle around the house, I took it upon myself to rewrite their code to use the IK method of controlling the servo angles.
Several days and much more cussing later, I had an ESP32-WROOM-CAM with the PCA9685 hooked into the I2C bus, 12 MG90S servos nicely plugged into the PWM driver, and a web page served by the HTTP server running on the ESP32, from which I could command the quad to move a given leg's tip by a given amount.
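The web side is nothing fancy; a handler roughly like this one takes a leg index and an offset and hands it to the IK (the endpoint name and parameters are illustrative, and moveLegTipBy is sketched in the next snippet):

```cpp
#include <WebServer.h>

WebServer server(80);

void moveLegTipBy(int leg, float dx, float dy, float dz);  // sketched below

// Hypothetical endpoint: /move?leg=0&dx=10&dy=0&dz=-5 nudges one foot tip (mm).
void handleMove() {
  int   leg = server.arg("leg").toInt();
  float dx  = server.arg("dx").toFloat();
  float dy  = server.arg("dy").toFloat();
  float dz  = server.arg("dz").toFloat();
  moveLegTipBy(leg, dx, dy, dz);  // runs the IK and writes the leg's three servos
  server.send(200, "text/plain", "OK");
}

void setupServer() {
  server.on("/move", handleMove);
  server.begin();                 // server.handleClient() then runs from loop()
}
```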
Now that the IK seemed to work, I had to turn to moving that leg tip. Some more head scratching and "torments of creation" later, I had functions to move a leg by a given amount along a specific axis, move all legs at once, and so forth. This proved quite useful: moving all legs at once, at the same speed, over a specific distance was exactly what I needed for my approach to getting the bot to walk.
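Here's roughly what those helpers look like, with the motion chopped into small sub-steps so speed is controlled by timing rather than distance (the state arrays and helper names are mine, and writeLegServos stands in for the IK plus servo writes):

```cpp
// Current foot tip position of each leg in its hip frame (mm).
float footPos[4][3];

void writeLegServos(int leg);  // placeholder: solveLegIK() + three setServoAngle() calls

// Move one foot tip by an offset, chopped into ~1 mm sub-steps so the
// travel speed is set by the delay, not by the distance.
void moveLegTipBy(int leg, float dx, float dy, float dz) {
  float len = sqrtf(dx * dx + dy * dy + dz * dz);
  int   n   = (int)fmaxf(1.0f, len);       // ~1 mm per sub-step
  for (int i = 0; i < n; i++) {
    footPos[leg][0] += dx / n;
    footPos[leg][1] += dy / n;
    footPos[leg][2] += dz / n;
    writeLegServos(leg);
    delay(5);                              // crude speed control
  }
}

// Shift all four feet together by the same offset; with the feet on the
// ground this slides the whole body the opposite way.
void moveAllLegsBy(float dx, float dy, float dz) {
  float len = sqrtf(dx * dx + dy * dy + dz * dz);
  int   n   = (int)fmaxf(1.0f, len);
  for (int i = 0; i < n; i++) {
    for (int leg = 0; leg < 4; leg++) {
      footPos[leg][0] += dx / n;
      footPos[leg][1] += dy / n;
      footPos[leg][2] += dz / n;
      writeLegServos(leg);                 // all legs updated in the same time slice
    }
    delay(5);
  }
}
```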
Next came the trial and error part of identifying gait patterns and how to implement them.
Obviously I did not want to reinvent the wheel... again... but the information I did find about gait patterns didn't quite fit how I was planning to get my bot to walk.
My idea was to have buttons on the UI that you click to tell the bot to take one step, which I later changed so you can tell it to advance one "tick" of a step in a gait pattern. Now I had to devise the gait patterns and code them; the tick mechanism itself is sketched below.
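The tick mechanism boils down to a tiny state machine: one click (or timer event) runs one phase of the current gait and advances to the next. Roughly (runGaitPhase is a placeholder for the actual per-phase leg moves):

```cpp
// A gait is a repeating sequence of phases; one "tick" runs one phase.
int gaitPhase = 0;
const int NUM_PHASES = 4;

void runGaitPhase(int phase);  // placeholder for the actual per-phase leg moves

void gaitTick() {
  runGaitPhase(gaitPhase);
  gaitPhase = (gaitPhase + 1) % NUM_PHASES;
}

// The UI's "step" button simply hits this endpoint once per click.
void setupGaitEndpoint() {
  server.on("/tick", []() { gaitTick(); server.send(200, "text/plain", "OK"); });
}
```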
The easiest one, I expected, would be a walk or crawl, which I pictured as moving one leg at a time up -> forward -> down -> back, or up&forward -> down -> back for a quicker gait. But this led to the bots not walking straight: the leg doing the moving was no longer providing support, and during the back movement the remaining three legs were actually dragging on the ground and resisting.
This sucked, as I did not really have a working gait pattern.
Going back to the drawing board, a.k.a. searching for existing implementations, I came across a trot pattern description which seemed to make sense: two diagonal legs lift and shift forward while the other two diagonal legs stay on the ground; after the travelling legs touch down at their new points, the other diagonal pair does the same. But here's where the problems crept in: while a pair of legs was up, forward and coming down, it was no longer supporting the body in the same way, so the Center of Gravity of the whole bot was no longer over a stable location.

To add insult to injury, I had to use a pair of 5A BECs: one supplying 5V to the PCA9685 and the IMU, the other to the ESP32-WROOM-CAM and the HC-SR04. This had the unpleasant consequence that I needed a 3S LiPo pack to step down from 12.6V while providing enough amps for the hungry servos and the demanding ESP32, which acted as camera web server, WiFi client, IK engine and whatnot. The 3S pack I repurposed from one of my quadcopters proved to be quite heavy, and I had to get a bit creative with attaching it to the body. The upshot was that the whole robot assembly wasn't stable on just three legs: I had to really swivel the legs to keep the CG from landing outside the triangle formed by the three legs still touching the ground. This made any form of gait unfeasible.
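As an aside, checking whether the CG's ground projection stays inside the support triangle is a classic same-side test; a sketch with my own helper names, taking the three grounded feet as 2D points in the body frame:

```cpp
// Sign of the 2D cross product: which side of edge a->b the point p is on.
float side(float ax, float ay, float bx, float by, float px, float py) {
  return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

// True if the CG's ground projection (px, py) lies inside the triangle of
// the three grounded feet f[0..2] -- i.e. the pose is statically stable.
bool cgInsideSupportTriangle(float f[3][2], float px, float py) {
  float s1 = side(f[0][0], f[0][1], f[1][0], f[1][1], px, py);
  float s2 = side(f[1][0], f[1][1], f[2][0], f[2][1], px, py);
  float s3 = side(f[2][0], f[2][1], f[0][0], f[0][1], px, py);
  bool hasNeg = (s1 < 0) || (s2 < 0) || (s3 < 0);
  bool hasPos = (s1 > 0) || (s2 > 0) || (s3 > 0);
  return !(hasNeg && hasPos);  // all on the same side (or on an edge) = inside
}
```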
My next thought was to try and reposition the lifted leg to its destination as quickly as possible to avoid having the quad bot's body tumble down.
So my trot gait pattern turned out to be: lift and shift forward two diagonal legs while the remaining legs on the ground move backwards, sliding the body of the robot forward; then lower the lifted legs to the ground to restore stability, and do the same for the other pair of diagonal legs.
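Expressed with the helper functions from earlier, one full trot cycle looks roughly like this (the leg numbering is mine, with 0/3 and 1/2 as the diagonal pairs; on the real bot the swing and slide overlap more than this sequential version suggests):

```cpp
const float STEP_LEN    = 25.0f;  // mm of travel per half cycle
const float LIFT_HEIGHT = 15.0f;  // mm of foot clearance

// One diagonal pair swings up and forward while the grounded pair slides
// back, dragging the body forward; then the swung pair is planted again.
void trotHalfCycle(int swingA, int swingB, int stanceA, int stanceB) {
  moveLegTipBy(swingA, STEP_LEN, 0, -LIFT_HEIGHT);   // lift + carry forward
  moveLegTipBy(swingB, STEP_LEN, 0, -LIFT_HEIGHT);
  moveLegTipBy(stanceA, -STEP_LEN, 0, 0);            // grounded feet go back,
  moveLegTipBy(stanceB, -STEP_LEN, 0, 0);            // so the body goes forward
  moveLegTipBy(swingA, 0, 0, LIFT_HEIGHT);           // plant to restore support
  moveLegTipBy(swingB, 0, 0, LIFT_HEIGHT);
}

void trotStep() {
  trotHalfCycle(0, 3, 1, 2);  // first diagonal pair swings
  trotHalfCycle(1, 2, 0, 3);  // then the other pair
}
```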
To my surprise, this trotting gait pattern proved to be quite effective and the bot moved pretty fast.
I had to tweak the step length and leg movement speed to get the bot to move in a straight line, basically making sure it falls by the same amount when stepping with either pair of legs. Currently, with the body configuration I have, a leg step of 25 mm and a speed of 0.5 mm/s produces a straight trajectory.
I've attached the code and FreeCAD files for the leg assembly if anyone is interested in having a go at it, but this is not a how-to guide or a step-by-step set of build and assembly instructions. Neither is the code I wrote: it works, but there is no detailed explanation of how it works or what each function does. Hopefully it's fairly self-explanatory; if not, ask me and I'll try to clarify.