In the second part of this project, we will continue with the portion that should be much easier: programming. When programming a hexapod, there are usually two different approaches. The first is to simply figure out a sequence of servo movements that makes the robot walk forward. This is hard work that doesn’t really pay off – the code you produce can’t easily be ported to another hexapod. That is why there is a second approach, called inverse kinematics. But first, we need to upgrade our on-board microcontroller.
Adafruit 16-channel PWM Shield (or module; however, the shield is strongly recommended because it has a small prototyping area)
12 micro servos with metal gears (MG90S or equivalent)
4.8 or 6 V Battery (NiMH, Li-ion etc.)
60 M3 bolts + 120 nuts and washers (just for the body, you’ll need some extra to mount the components)
6 identical ball pen springs
HC-SR04 Ultrasonic ranging module (Optional)
GitHub – you can find all the Arduino source code and 3D models for printing here
UNO -> MEGA
In the previous article, the Arduino UNO was recommended for this project. However, I ran into an issue with the UNO: it simply doesn’t have enough SRAM to do all the calculations required for the inverse kinematics model to work. Most of these calculations are done with floating-point numbers, each of which takes 4 bytes of memory – twice as much as an integer on AVR. That might not seem like much, but the UNO has only 2 kB of SRAM, some of which is taken up by global variables. If we reserve 0.5 kB for all the globals and other local variables, we are left with 1.5 kB of free memory, and it takes only 384 floats to use all of it up. 384 might seem like a lot, but it is not enough for the amount of data the IK model will produce (read the “The Algorithm” section below to find out why), so we have to get more memory somehow.
The easiest way to get it is to swap the UNO for a MEGA. The two boards are pin-compatible, so there are no changes to the schematics. Not only will we get four times as much RAM for all the calculations, we also get eight times as much flash memory to store our programs. We most likely won’t use all of it, but it is always nice to have some reserve. Below are the updated Fritzing schematics; if you have the latest revision of the Arduino MEGA (Rev 3), the exchange is as simple as disconnecting the UNO and connecting the MEGA.
Figure 1. Updated schematic for the PWM shield
Figure 2. Updated schematic for the PWM module and HC-SR04 ultrasonic sensor
Now we can take a look at a little bit of physics, a lot of maths and a tiny bit of code.
Intro into Inverse Kinematics
Some of you may remember that in high school, there was a part of physics called “kinematics”. This is an area of mechanics that – simply put – describes the motion of an object (or point): you start from a known motion and analyze it using mathematical equations and models. As the name suggests, inverse kinematics (IK) does the exact opposite: from the mathematical model and a desired end position, we derive the movement that produces it.
In robotics, what we typically have is an algorithm that calculates the movements of all the joints based only on the desired movement of the endpoint. Now you can clearly see the advantage of inverse kinematics over hard-coding the servo movements – it is universal. Just one algorithm should, in theory, handle any movement the robot is capable of. From the user’s point of view, it’s also very easy to use – you just tell the robot to turn 90° to the left and then walk one meter, for example. You don’t need to worry about the position of each servo.
In the above paragraphs, one word keeps coming back: a (mathematical) model. While this sounds really difficult, in the case of this hexapod the model is actually simple: any position the robot is mechanically capable of is defined by a set of seven points – one for the body, the other six for the legs. If you take a look inside the AP_Utils library (available on GitHub), specifically inside AP_Utils.h, you will see (among other things) the definitions of these points:
You can see them being declared inside AP_Utils class as private structures
There are two reasons for these structures to be private:
The user should not be able to change these values at will. They exist to keep track of the current position of the robot, so the only time they should change is when the robot is actually moving. If the user were to, for example, change the current z coordinate of the origin, it would lead to unpredictable changes in the IK model – which is obviously undesirable.
It’s generally good programming practice (especially in C++) to keep the number of public functions and variables of a class to the minimum needed for interaction from outside the class. This improves encapsulation and makes it much easier to keep the public API small and stable.
If we want, we can visualize these points. Our entire robot is now represented by seven points (Figure 3).
Figure 3. Graphical representation of the IK model. Simply put, this is what the hexapod “thinks” it looks like. The red dot is the body, the blue dots are the legs.
These structures are used to keep track of the position of all the legs, as well as of the robot itself. Any position the robot can reach is defined by the x, y and z coordinates of the body. You might notice that the position of each leg is defined by only two coordinates: phi and z. This is because each leg has only two degrees of freedom and can therefore move only in two axes. The phi coordinate represents horizontal movement and z vertical; both range from -1 to 1 and only define the position of the leg relative to the body. While this might seem like an unnecessary complication right now, it is actually much easier than recalculating the x, y and z of each leg after every movement.
Figure 4. Leg detail with the phi and z axes
We now have a simple enough mathematical representation of the robot, but we haven’t done anything with it quite yet. The next step is to figure out how to move the servos using only changes to this model. What we need is a program that takes a set of point coordinates as input and translates it into servo movement.
This is where another major roadblock appears, and this time, simply swapping in another Arduino won’t do. When walking, we will usually need to move several servos at once. However, an Arduino (and any AVR, for that matter) can only perform one task at a time – if we want to move the servos smoothly, we have to move them one after the other. If we simply jumped each servo from one extreme position to the opposite one, the whole robot would be very unstable.
One way around this problem is to calculate the positions of all the servos beforehand, then iterate through these tables, setting all the servos as we go. Because the Arduino MEGA runs at 16 MHz, the servos appear to complete one fluid motion – even though they are actually moving in small increments. This is the same effect videos use to create an illusion of continuous movement out of a collection of still images: the human brain simply can’t process visual information that quickly. If we add a 50-millisecond delay after each position change, it becomes clear that the servo movement is really made up of small steps.
This is also why we had to change the Arduino. If we want to move every servo, we need a lot of memory to store the coordinates we just calculated. Moving all 12 servos requires about 600 floats of movement coordinates, because each servo needs at least 50 positions to create the illusion of smooth movement. 600 floats is about 2.3 kB of RAM – already more than the UNO has in total.
In the AP_Utils library, the functions that translate positions into servo movements are traceLeg() and setLegs(). traceLeg() only does the calculation: given the phi and z coordinates of the desired end point, it creates a path in the form of an array of coordinates. The paths can have various shapes; currently supported are linear (a simple line from one point to another), circular arc, and elliptic arc. The latter two are used to make walking easier. The second function, setLegs(), actually moves all the given legs according to the results from traceLeg(). All of this, however, is hidden from the regular user. The point of this entire approach is to be as user-friendly as possible: the end user never has to call setLegs() directly, only the functions directly related to walking.
That brings us to the final part of the IK programming: actually walking. We have now covered all the basics – we created a model to keep track of everything, we can move multiple servos, and we can even make the movement reasonably smooth. In the following text, the leg numbers correspond to the numbers in the following image:
Figure 5. Leg numbering
This numbering system is also consistent with the code you can find in the library.
Let’s start with something easy: turning on the spot. Generally, when setting a new position for the legs, you always have to keep at least three of them on the ground. The reason is obvious: unless you can move the legs extremely fast, the robot would lose stability and fall over. We can turn the robot in a few steps:
1. Move legs 0, 2 and 4 to their max phi coordinate (maximum horizontal angle).
Figure 6. Turning step 1 (Legs 0, 2 and 4)
2. Do the same for legs 1, 3 and 5.
Figure 7. Turning step 2 (Legs 1, 3 and 5)
3. Turn the body. This is done by moving all the horizontal servos in the direction opposite to the one they moved in steps 1 and 2. Because all the legs are on the ground and can’t move, the only thing that moves is the body itself.
Figure 8. Turning step 3
With each repetition of these steps, the robot turns about 80°. We can, of course, turn less than that, simply by not going all the way to the maximum phi coordinate. Thanks to the clever algorithm behind the traceLeg() function, we don’t have to calculate any z coordinates for the legs – that is done automatically to form a circular or elliptic arc. You can see this in the following video.
The final step is walking. Specifically, we want the robot to at least walk forward. There are many different walking algorithms for hexapod robots, but most of them rely on legs with 3 degrees of freedom. Ours have only 2, so we’ll have to improvise a little. The motion I came up with isn’t as fast as it could be, but it is the easiest one to code and it’s easy to see what is going on:
1. First, legs 0 and 3 are moved forward.
Figure 9. Walking step 1 (Legs 0 & 3)
2. Then, legs 2 and 5 are moved in the same direction
Figure 10. Walking step 2 (Legs 2 & 5)
3. The same is done with legs 1 and 4.
Figure 11. Walking step 3 (Legs 1 & 4)
4. Now the body is moved forward and the process can be repeated.
Figure 12. Walking step 4
You can see the entire motion in the following video:
Basic Obstacle Avoidance
You might have noticed that a part of the library is dedicated to the SR04 ultrasonic rangefinder. This is to get some information about the environment the robot is moving in. Of course, one stationary sensor wouldn’t be nearly enough, so we mounted it on an extra servo in the previous article.
I’m sure most people attempting to build a hexapod robot are at least somewhat familiar with how ultrasonic rangefinders work. The function I recommend using to interface with this sensor is AP_Utils::sr04_median. It provides the most accurate result of all the SR04 functions in the library. You can even change the units of the output; currently supported are millimeters, centimeters, and meters!
Important: Please note that you will need the Adafruit PWM driver library for AP_Utils to work; you can download it here. After downloading, install it like any other Arduino library.
Below is an example of a very simple “autonomous” mode using everything we have discussed so far: walking, turning, and reading distances from the SR04. If you have read the article carefully, you should have no problem understanding the most important parts of the code. For a more detailed reference on all the functions, please refer to the README.md inside the library folder or on the GitHub page.
//if an obstacle is closer than 20 cm, we have to turn
//turn 90 degrees to the right
Congratulations on making it this far through one of the more challenging projects – good job! Before you try to make your own Ardupod walk, be sure to run both calibration.ino and servo_test.ino from the examples folder. These are critical for setting up all your servos correctly so they won’t break! In the next article, we may revisit this project one final time to fix some mechanical weak points and, more importantly, add some improvements like remote control.
Jan is currently studying Electrical Engineering at Brno University of Technology. He has many years of experience building projects using Arduino and other microcontrollers. His special interest lies in mechanical design of robotic systems.