Categories
Mechanical Robotics

Wheel Hubs and other Modifications for my Kaya

Just a very short blog on some wheel hubs I designed for my Kaya. Following the build of my Jetbot, I wanted to experiment a bit with the Intel RealSense Depth Camera D435 in combination with the Nvidia Jetson Nano board, and decided to up the ante and build a Kaya robot as well.

This has been one of the more challenging projects I’ve done so far. I really enjoyed printing the very nicely designed plastic parts, but found it hard to source all the components as specified on the Kaya Github. In the end I managed to source everything except the VEX 3.25″ Omni-Directional Wheel with SKU part number 217-4775 and the Terminal Block from McMaster with its associated Terminal Block Jumpers.

After some more searching on the Internet, I found the VEX 3.25″ Omni-Directional Wheel with SKU part number 276-3526. These have the same outside diameter, but a slightly different arrangement of the rollers and a different hub design. I was not able to find the Terminal Block for sale in Europe. Hence, I had to design and print custom wheel hubs and a look-alike terminal block. You can download the wheel hub design from Thingiverse if you run into the same issue I did. I will add the design of the Terminal Block to Thingiverse as soon as possible as well.

As you can see from the pictures below, the wheel hubs fit nicely on both the servo motors and the omni-wheels.

The picture below shows how I resolved the issue of the unobtainable terminal block, replacing it with a custom-designed one:

Custom designed terminal block on Kaya.

One more thing before I end this really short blog: I ran into trouble powering up Kaya. Almost every time I switched the power on, the power light on the Jetson Nano board would light up, only to turn off again after a few seconds. Really annoying, and it took me some days to figure out what was wrong. Powering up using the power adapter worked fine, and powering up over a USB cable worked too, but powering up from the batteries only worked about half of the time.

Only once I swapped the 2.1 x 5.5 mm Male Barrel Plug Pigtail for a Male Barrel Plug Wire Adapter with thicker (larger diameter) wire did I manage to resolve this. The Kaya robot now powers on without exception on battery power.

Thick wire replacement of thin wire pig tail.
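
Why the thicker wire helps: the Jetson Nano draws a current spike at start-up, and a thin pigtail drops enough voltage across its resistance to trigger a brown-out. A rough sketch of the effect (the wire gauges, length and surge current are my assumptions for illustration, not measured values):

```python
import math

RHO_COPPER = 1.68e-8  # resistivity of copper in ohm*m

def wire_drop(diameter_mm, length_m, current_a):
    """Voltage drop over both conductors (out and return) of a wire pair."""
    area_m2 = math.pi * (diameter_mm / 2 / 1000) ** 2
    resistance = RHO_COPPER * (2 * length_m) / area_m2
    return current_a * resistance

# assumed: 0.3 m pigtail carrying a 3 A start-up surge
thin = wire_drop(0.51, 0.3, 3.0)   # roughly 24 AWG
thick = wire_drop(1.02, 0.3, 3.0)  # roughly 18 AWG
print(f"thin: {thin * 1000:.0f} mV, thick: {thick * 1000:.0f} mV")
```

Doubling the wire diameter quarters the resistance; together with contact resistance in the barrel plug, that difference can be enough to keep the Nano above its brown-out threshold during the start-up surge.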

Although short, I do hope you will find this blog useful when you build your own Kaya or run into the same issues I did. Please do let me know if this has helped you somewhat or if you’re missing information.

Categories
Sensors

Ultrasonic Sensor for the Jetbot

The Nvidia Jetbot offers an excellent opportunity to explore the Nvidia Jetson Nano, do some nice 3D printing and learn AI at the same time.

I managed to get the Jetbot to recognize people and follow them along using AI in no time, but I ran into difficulty getting the Jetbot to learn about its environment. I could not get the Jetbot to learn the difference between the floor and the wall, no matter what I tried. That took away a bit of the fun for me, so I decided to add an ‘old-fashioned’ ultrasonic sensor. I attached it to the front of the Jetbot using a 3D printed interface piece, combined it with an Adafruit Feather M0 which I stuck to the bottom, and interfaced it to the Jetson Nano via I2C.

In this blog I will explain how to do this yourself, so you can try out whether it is useful for you. The end result for me looks like this:

Jetbot with ultrasonic sensor attached to the front.

For this project you will need the following:

  • A complete and functioning Jetbot. If you want to build one yourself (which I highly recommend), full instructions can be found on the Nvidia Jetbot Github,
  • The ultrasonic sensor (I use an HY-SRF05, but I guess an HC-SR04 would do as well),
  • The 3D printed front. You will need to print the front for attaching the ultrasonic sensor to the Jetbot yourself. You can download the STL for this part from Thingiverse.
  • A small microcontroller (Adafruit Feather M0) to control the ultrasonic sensor in real time and avoid timing (scheduling) issues on the Jetson Nano,
  • A set of short female and male headers to piggyback the Adafruit DC Motor FeatherWing of the Jetbot onto the Feather M0,
  • Two 10K resistors to convert the response from the sensor from 5V to a safe 2.5V,
  • A few microscrews to fix the sensor and the 3D printed front,
  • Some multicolored (jumper) wires.

Start with printing the front and soldering the short female headers onto the Feather M0. The female headers should be soldered facing upwards (i.e. the headers and components are on the same side). Next, we’ll wire up the ultrasonic sensor to the Feather and run a test to check proper functioning of the sensor and the Feather. Note that the SRF05 sensor requires 5V power, while the Feather runs on 3.3V. The Trig signal can be 3V or 5V, but the ‘return’ Echo signal is 5V logic. As the Feather can only handle 3.3V, use the two resistors as a divider to convert the 5V logic level to a safe 2.5V. Be careful when wiring up, to prevent damage to the Feather.

  • Connect the SRF05 GND to a GND pin of the Feather M0,
  • Connect the second GND pin of the Feather M0 to a Jetson Nano GND pin,
  • Connect the SRF05 VCC to the 5V power pin of the Jetson Nano,
  • Connect the SRF05 TRIG pin to pin 11 of the Feather M0,
  • Connect the SRF05 ECHO pin via a divider bridge (using the two resistors) to pin 12 of the Feather M0.
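
The divider bridge above is nothing more than two equal resistors splitting the 5V echo signal in half; a quick sketch of the math behind it:

```python
def divider_out(v_in, r_top, r_bottom):
    """Output of a two-resistor voltage divider, tapped over r_bottom:
    Vout = Vin * R_bottom / (R_top + R_bottom)."""
    return v_in * r_bottom / (r_top + r_bottom)

# two equal 10K resistors halve the 5V echo to a safe 2.5V
print(divider_out(5.0, 10_000, 10_000))  # -> 2.5
```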

If done correctly, your test assembly should resemble the breadboard layout shown below.

Breadboard layout for ultrasonic sensor and Feather M0 testing

Start up the Jetson Nano so that 5V is provided to the sensor, hook up a USB cable between the Feather and your computer, and upload the sketch below using the Arduino IDE.

/*
Ultrasonic sensor for Jetbot
----------------------------
Trigger on pin 11
Response on pin 12
5V to 2.5V response conversion with two 10K resistors
I2C slave sender on address 8
----------------------------
This sketch is based on the Ping))) Sensor sketch created by David A. Mellis on 3 Nov 2008, modified by Tom Igoe on 30 Aug 2011 and modified by Ron in 2020 to 
include I2C communications.
----------------------------
The original source of Ping))) can be found on:
  http://www.arduino.cc/en/Tutorial/Ping
This example code is in the public domain.
*/

#include <Wire.h>

// this constant won't change. 
// It's the pin number of the sensor's output:
const int pingPin = 11;
const int distPin = 12;
long cm;

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
  pinMode(pingPin, OUTPUT);
  pinMode(distPin, INPUT);
  Wire.begin(8);
  Wire.onRequest(requestEvent);
}

void loop() {
  // establish a variable for the duration of the ping:
  long duration;
  // The PING))) is triggered by a HIGH pulse of 2 
  // or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean 
  // HIGH pulse:
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);
  // A HIGH pulse whose duration is the time (in 
  // microseconds) from the sending of the ping
  // to the reception of its echo off of an object.
  duration = pulseIn(distPin, HIGH);
  // convert the time into a distance
  // inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
  // Print distance to serial monitor for testing
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();
  delay(100);
}

// I2C function that executes whenever data is requested 
// by master. This function is registered as an event, 
// see setup()
void requestEvent() {
  byte buf[4];
  buf[0] = (byte) cm;
  buf[1] = (byte) (cm >> 8);   // parentheses needed: without them the
  buf[2] = (byte) (cm >> 16);  // cast to byte would happen before the
  buf[3] = (byte) (cm >> 24);  // shift, sending zeros for bytes 1-3
  Wire.write(buf, 4); // respond with message of 4 bytes 
                      // (cm is long type)
  Serial.println("I2C request");
}

long microsecondsToInches(long microseconds) {
  // According to Parallax's datasheet for the PING))), 
  // there are 73.746
  // microseconds per inch (i.e. sound travels at 1130 
  // feet per second).
  // This gives the distance travelled by the ping, 
  // outbound and return,
  // so we divide by 2 to get the distance of the 
  // obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds) {
  // The speed of sound is 340 m/s or 29 microseconds 
  // per centimeter.
  // The ping travels out and back, so to find the 
  // distance of the object we take half of the distance 
  // travelled.
  return microseconds / 29 / 2;
}

When uploading has completed, open the Arduino Serial Monitor. Hold your hand in front of the sensor, and the distance between the sensor and your hand should be displayed in the monitor. Move your hand closer to and further away from the ultrasonic sensor and note the changing readout. Once you have this functioning, we’re ready for the next step: interfacing with the Jetson Nano using I2C.

Switch off power on both the Jetson Nano and the Feather M0. Now temporarily disconnect the I2C connection to the motor driver from pins 3 and 5 of the Jetson Nano (remember their positions!) and make an I2C connection between the Jetson Nano and the Feather M0: connect the SCL pin of the Feather M0 to the SCL pin of the Jetson Nano (pin 5), and the SDA pin of the Feather M0 to the SDA pin of the Jetson Nano (pin 3). Power up again and, once the Jetbot software has loaded, connect to the JupyterLab interface on your computer and create a new Python session. Copy and paste the following short Python test script into JupyterLab:

import smbus
import time

bus = smbus.SMBus(1)
address = 8 # I2C address for Ultrasonic bus

def distance():
    """Returns distance measured by the ultrasonic sensor in cm
    (reads only the low byte, valid up to 255 cm)"""
    return bus.read_byte_data(address, 0)

while True:
    print(distance(), " cm")
    time.sleep(1.0)

Start the script by pressing shift-enter in JupyterLab and you should see the result of a distance measurement every second. Try holding and moving your hand at several distances from the ultrasonic sensor and note that the reported distance changes in the JupyterLab output. If everything works as described, you now have a working distance measurement system communicating with the Jetbot over I2C.
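
A side note on the data format: `read_byte_data` only fetches the first of the four bytes the Feather’s `requestEvent()` sends, which is fine for distances up to 255 cm. If you ever need the full 32-bit value, you could read a four-byte block and unpack it. The helper below is my own addition, not part of the original scripts:

```python
import struct

def bytes_to_cm(buf):
    """Unpack the 4 little-endian bytes sent by the Feather's
    requestEvent() back into a signed 32-bit distance in cm."""
    return struct.unpack('<l', bytes(buf[:4]))[0]

# on the Jetbot, the buffer would come from:
#   buf = bus.read_i2c_block_data(address, 0, 4)
print(bytes_to_cm([44, 1, 0, 0]))  # -> 300
```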

All we need to do now is integrate the ultrasonic sensor and the Feather M0 microcontroller into the Jetbot. Start by unscrewing the DC Motor FeatherWing from the Jetbot and disconnecting the jumper wires from the Feather M0 microcontroller. Desolder the 2-pin and 3-pin headers on the motor driver, move these headers to the inner rows of the same pins on the board, and resolder them. Solder an additional 2-pin header to pins 11 and 12 of the motor driver board (again using the inner rows), and solder a 1-pin header to the second GND of the motor driver board. Next, solder the new short male header strips to the bottom of the motor driver board. Position the Feather M0 in the location of the motor driver and secure it in place using the micro screws that were used to hold the motor driver in position.

Fasten the ultrasonic sensor to the 3D printed front using two micro screws. Connect the jumper wires to the sensor if required and then fasten the 3D printed front to the Jetbot using two micro screws in the top holes of the 3D printed front.

Now fix the Motor FeatherWing in place on top of the Feather M0 by inserting the short male headers into the short female headers, and re-attach the jumper wires in their original positions. Connect the ECHO and TRIG pins of the ultrasonic sensor to the 2-pin header (pins 11 and 12) of the FeatherWing, and connect the GND pin of the ultrasonic sensor to the second GND single header pin on the motor driver. Route the 5V jumper wire from the sensor to the 5V pin of the Jetson Nano. Reconnect the I2C wires you disconnected from the Jetson Nano earlier. Once finished, it should resemble the picture below. Note that I had to bend the 2-pin and 3-pin headers 90 degrees to allow sufficient ground clearance. Tidy up the jumper wires as well as you can and check whether your build resembles (or looks better than!) the picture: note the I2C wires at the bottom right of the FeatherWing, the TRIG and ECHO wires of the sensor at the middle right, and the GND wire of the sensor at the bottom left.

Motor Driver Feather Wing piggy-backed on M0 Feather and all jumper cables attached.

Check all your wiring one last time, power up the Jetbot, open the JupyterLab environment on your computer, and try the following Python script to prove the integration has been successful:

from jetbot import Robot
import smbus
import time

robot = Robot()
bus = smbus.SMBus(1)
address = 8 # I2C address for Ultrasonic bus

def distance():
    """Returns distance measured by the ultrasonic sensor in cm
    (reads only the low byte, valid up to 255 cm)"""
    return bus.read_byte_data(address, 0)

while True:
    if (distance() < 20):
       robot.backward(0.4)
       time.sleep(0.6)
       robot.left(0.4)
       time.sleep(0.5)
    else:
       robot.forward(0.4)

If all’s well, your Jetbot should now be cruising along the floor; once it encounters a wall, it will back up a little, make a left turn, and then continue its journey. You will probably notice that this only works correctly when the Jetbot approaches the wall more or less perpendicularly. When the Jetbot moves at a shallow angle (almost parallel) to the wall, the ultrasonic pulses bounce off the wall away from the Jetbot, and the Jetbot will not pick up a return signal from the sensor. Still, for me this solution works better for moving around the room than trying to get the AI system to recognize my walls, and I find it a more efficient method (in terms of computing power and power consumption) for basic navigation in a room. I hope you enjoyed my blog and perhaps even managed to replicate my little experiment as well. I would love to have your feedback and thoughts on this subject.

Categories
IoT MQTT

Secure MQTT transmissions

I have used the MQTT protocol quite a lot over the years for domestic as well as industrial applications and have forever been struggling with secure message transmissions.

Magnetic Puck shaped Arduino Nano 33 IoT IMU Sensor

My usual solution for this has been to include a frame counter in the payload and then apply a light encryption to the payload. On the receiving side I would then decrypt the payload, check for an increase of the frame counter, and only then accept the payload. Of course, this only encrypted the payload; other parts of the MQTT message remained visible during their travel from my node to my server. I found that anything more than that was just not possible with the nodes I developed using microcontrollers such as the ARM Cortex M0.
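
To make that scheme concrete, here is a minimal sketch of the idea in Python. The XOR keystream stands in for whatever light cipher the microcontroller can afford, and the key, counter width and payload layout are illustrative assumptions, not my original node code:

```python
import struct

KEY = b"not-a-real-key!!"  # shared secret between node and server (illustrative)

def xor_cipher(data, key):
    """Toy 'light encryption': XOR against a repeating key.
    Stands in for a real cipher; not secure on its own."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_payload(counter, value):
    """Node side: pack frame counter + sensor value, then encrypt."""
    return xor_cipher(struct.pack('<IH', counter, value), KEY)

def accept_payload(payload, last_counter):
    """Server side: decrypt, then only accept the payload if the
    frame counter increased (rejects replayed messages)."""
    counter, value = struct.unpack('<IH', xor_cipher(payload, KEY))
    if counter <= last_counter:
        return None  # replay or stale frame: reject
    return counter, value
```

A replayed capture of an old message decrypts correctly but fails the counter check, which is exactly the spoofing protection the frame counter provides.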

I started looking for new opportunities when LoRaWAN nodes were being developed that came with on-board encryption to allow secure data transmissions. Surely somebody would soon come up with a nice microcontroller with a crypto chip on board, I thought…

… And so they did. Some months ago, Arduino marketed their Arduino Nano 33 IoT microdevice, which supports secure communication through an on-board Microchip® ECC608 crypto chip. Finally I am able to communicate over a secure channel between node and server using Transport Layer Security (TLS) and Secure Sockets Layer (SSL). Time for an experiment.

Let’s assume a case where you want to check the cycle time of your machine and monitor the number of products made per hour. You want to use Industry 4.0 technology without interfering with (or needing to interface with) your existing SCADA or other MES system you may have installed. Let’s also assume you want to do this in a secure manner using the existing (WPA2-secured) WiFi already installed in your machine shop. And just for good measure, you want to prevent others from intercepting the message stream and/or spoofing your system with false messages.
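
The monitoring math itself is straightforward; a small sketch of what the server side would compute from the cycle-completion timestamps the node publishes (the function names are mine):

```python
def mean_cycle_time(timestamps_s):
    """Average cycle time in seconds from successive
    cycle-completion timestamps."""
    deltas = [b - a for a, b in zip(timestamps_s, timestamps_s[1:])]
    return sum(deltas) / len(deltas)

def products_per_hour(cycle_time_s):
    """Products made per hour for a given cycle time in seconds."""
    return 3600.0 / cycle_time_s

# a machine finishing a product every 45 s makes 80 per hour
print(products_per_hour(mean_cycle_time([0, 45, 90, 135])))  # -> 80.0
```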

In this blog, I will set up a simple server with an MQTT broker and Node-RED. I will discuss a simple Node-RED script to interface with the broker, handle the incoming messages and display a dashboard. Protecting your server and your Node-RED environment is outside the scope of this blog, but needless to say it is also very important. I will also show you how to prepare a 3D printed enclosure for the node, complete with two small but strong magnets for attaching it to the moving metal machine parts without modifying those parts.

The first thing is setting up an MQTT broker on your server with TLS and SSL enabled, to provide a secure communication channel between a node (client) and the server. I use RabbitMQ for industrial applications, but for domestic use and quick testing I find Mosquitto easier to set up. There are multiple places on the Internet that provide good instructions to help you along with setting up your MQTT broker; the instructions from HiveMQ are a good starting point for setting up Mosquitto with TLS and SSL.
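
On the client side, the TLS part boils down to an SSL context that verifies the broker’s certificate. Below is a sketch using Python’s standard library (an MQTT client library such as paho-mqtt configures the same things for you through its `tls_set()` call); the CA file argument is a placeholder for the certificate you generate while following those instructions:

```python
import ssl

def broker_tls_context(ca_file=None):
    """TLS context a client would use towards the broker (port 8883):
    require and verify the server certificate, and check that the
    certificate matches the broker's hostname."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    if ca_file is not None:
        # e.g. the self-signed CA certificate created for Mosquitto
        ctx.load_verify_locations(cafile=ca_file)
    return ctx

ctx = broker_tls_context()
```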

You can already start printing your enclosure; the STL files can be found on Thingiverse.

Categories
Robotics Stepper Motors

A drawing robot

Right smack in the middle, my first blog. Not about what I’ve done in the past, not about what I want to do in the near future. None of that. Just a first blog on what’s been on my mind over the last few weeks: how do stepper motors work and how can I effectively drive them.

So here is what started this all:

Robotic Arm

This is my Robotic Arm, 3D printed on my very own 3D printer, courtesy of Zortrax. It looks really nice and, although axes 4 and 5 are not actuated, it is great for experiments and for getting acquainted with robotics. Should you want to print one yourself, you can find the design here. The design was placed on the Zortrax website back in 2015, but has held up great over the last five years. (I may write a blog on building this robotic arm later. Lots to learn there too.)

Now, the people of Zortrax recommend the use of NEMA 17 stepper motors, a RAMPS board with three A4988 stepper drivers, an Arduino Mega with Marlin firmware, Pronterface as the control system, and a 12V, minimum 100W power supply. As it so happens, I have those parts waiting for me in my now-obsolete Prusa Mendel 3D printer that I built eight years ago. If you don’t have an old Prusa Mendel waiting for you, you could of course check out Amazon or equivalent for a reasonably priced kit containing all required items apart from the stepper motors. If you were planning on shopping on Amazon, you could pick up three of these for starters. As a power supply, this one would be good to start with, although we will discuss power supplies in more detail later in the blog.

Arduino Mega with RAMPS and stepper drivers attached.

The sample code that Zortrax provides is a short G-code file that makes the robot arms move (dance, actually), but in my case it only works when I hold the arms in a vertical position during the start. If I do not, the arms do not lift, or they suddenly drop. It is not yet as in the movies.

Dancing Zortrax Robotic Arms by Trinity3D

Although in the video clip I also only see the axis 2 robotic arm in a near-vertical position, my robotic arm cannot hold its position when horizontal (or at 45 degrees, for that matter). Somehow, my robot lacks sufficient power to lift its arms from horizontal to vertical.

My robot arm cannot lift its own weight.

So, time to start tinkering. Do I need a larger power supply, bigger stepper motors, stronger stepper drivers, or another control system?

And thus, after one month, several blown up stepper motor driver boards, a blown up power supply and some locked up stepper motors, I think I might have found a solution:

Very strong drivers

And combined with a large 24V power supply, it looks like this:

Large stepper motor drivers with 24V Power Supply

The drive system consists of three TB6600 4A 9-42V stepper motor driver controllers, set to 8 microsteps and 3.5A current per driver controller. My stepper motors are rated for 1.5A per winding and, with two windings per motor, should not receive more than 3.0A in total. However, I found that 3.0A is not sufficient to lift the robot arm from a horizontal position to a vertical position. At 3.5A (and at 24V) it can and, although the stepper motors do get very hot, it is working quite well for me. If you want to replicate this, please do so at your own risk and be careful: 3.5A at 24V is already quite dangerous, and you might be overloading the stepper motors. I found both the driver controllers and the switching power supply on Amazon.
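
As a sanity check on supply sizing, here is a rough worst-case budget for this drive system (chopper drivers like the TB6600 draw less from the supply in practice, so treat this as an upper bound):

```python
motors = 3
current_per_motor_a = 3.5  # A, as set on each TB6600 driver
supply_voltage_v = 24.0

total_current_a = motors * current_per_motor_a           # 10.5 A
worst_case_power_w = total_current_a * supply_voltage_v  # 252 W
print(f"{total_current_a} A total, {worst_case_power_w} W worst case")
```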

As you can see from the picture above, I made a small frame to secure the three driver controllers to the power supply. I used 10×10 mm MakerBeams for this. I find them very easy to use; if you would like to try them out, just check out their website at https://www.makerbeam.com/. To prove that my hard work has paid off, below is a short video impression of my robot arm introducing itself to you.

Introducing my robot arm

My complete setup (on my way-too-messy workbench) now looks like the picture below. You will note that I have replaced the gripper with a permanent marker, all to achieve my final goal: a robot that can draw.

Ready to Draw!

Drawing Robot

The good people of Zortrax have indeed designed and shared a fine robot arm system, using off-the-shelf 3D printing hardware and software as much as possible. With a 12V power supply, the RAMPS board, and the Marlin firmware on the Arduino Mega, combined with the Pronterface software on your computer, it is indeed possible to actuate the robot arm using G-codes. However, I found that this setup has some limitations for me. For one, it is not strong enough to lift the arm from a horizontal position to a vertical position. Next to that, I find it very complicated to move the arm to a specific position using G-codes that actuate the stepper motors through an interface designed for 3D printers rather than for robotic arms.

I have now solved the power issue by switching to 24V and large 3.5A drivers, and the arm can lift itself. But how do I move the gripper from one position to the next without having to fiddle around with G-codes, driving and controlling each stepper motor individually? Wouldn’t it be nice if I could just tell the arm to move the gripper from one (X, Y, Z) position to the next? It’s time to freshen up my math skills. I figure I need two steps to transform a given (X, Y, Z) position of the gripper into the numbers of steps for all three stepper motors.

To make life a bit easier, I will focus on the position of the wrist joint instead of the non-controllable gripper. The first thing I need to do is transform the Cartesian coordinate (X, Y, Z) of the wrist to spherical coordinates. The math is as follows:
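
Reconstructed here from the `carth_to_spher()` code below (the original formula image is missing), the Cartesian-to-spherical transformation is:

```latex
r = \sqrt{X^2 + Y^2 + Z^2}, \qquad
\theta = \arctan\!\left(\frac{Y}{X}\right), \qquad
\phi = \arcsin\!\left(\frac{Z}{r}\right)
```

where θ is the horizontal angle, φ the vertical angle, and r the distance from the shoulder joint to the wrist.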

From this transformation, we now know the horizontal angle of the shoulder joint (θ) and the length from the centre axis of the shoulder joint to the centre axis of the wrist (r). What’s left is to calculate the vertical angle of the shoulder joint and the angle of the elbow joint. We can do this with a triangle calculation, given r and the fixed lengths of the upper arm and the lower arm. From https://www.calculator.net/triangle-calculator.html we can learn that the angles (A, B, and C) of a triangle can be calculated from the known lengths of its sides a, b, and c.
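
For a triangle with sides a, b and c opposite angles A, B and C, the law of cosines solved for the angles gives:

```latex
A = \arccos\!\left(\frac{b^2 + c^2 - a^2}{2bc}\right), \qquad
B = \arccos\!\left(\frac{a^2 + c^2 - b^2}{2ac}\right), \qquad
C = \arccos\!\left(\frac{a^2 + b^2 - c^2}{2ab}\right)
```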

For the vertical angle of the shoulder joint we need to combine the φ from the Cartesian-to-spherical transformation with the angle from the triangle calculation to obtain the angle with the horizontal plane. The C function that does this (with some minor error checking included) looks like this:

// global variables
double RR2_new;   // radius from shoulder to wrist
double AA1_new;   // horizontal angle of R
double AA2_new;   // vertical angle of R
double AA3_new;   // upper arm angle relative to R
double AA4_new;   // lower arm angle relative to upper arm
double LL1 = 160; // upper arm length
double LL2 = 340; // lower arm length
// min max angles
double AAO3min = 0.0387;   // angle lower arm
double AAO4min = 0.5233;   // angle upper arm
double AAO3max = 3.15000;  // angle lower arm
double AAO4max = 3.15000;  // angle upper arm
double RR2_min = 115.0;     // minimum possible radius
double RR2_max = LL1 + LL2; // maximum possible radius

boolean carth_to_spher(double X, double Y, double Z) {
   RR2_new = sqrt(X * X + Y * Y + Z * Z);
   AA1_new = atan(Y / X);
   AA2_new = asin(Z / RR2_new);
   // check if carth coordinate fits in spherical range of robot arm
   if (RR2_new > RR2_max) {
     Serial.println("Max R exceeded");
     return(false);
   } else if (RR2_new < RR2_min) {
     Serial.println("Min R exceeded");
     return(false);
   } else {
     AA3_new = PI - (AA2_new + acos((LL1 * LL1 + RR2_new * RR2_new - LL2 * LL2) / (2 * LL1 * RR2_new)));
     AA4_new = acos((LL2 * LL2 + LL1 * LL1 - RR2_new * RR2_new) / (2 * LL2 * LL1));
     if (AA3_new >= AAO3min && AA3_new <= AAO3max && AA4_new >= AAO4min && AA4_new <= AAO4max) {
       return(true);
     }
     else {
      Serial.println("Angles exceeded");
      return(false);
     }
   }
}

To calculate from angles to steps and then actuate the stepper motors, you can use the function below:

void move_to_carth(double X, double Y, double Z) {
   if (carth_to_spher(X, Y, Z)) {
     new_position_d0 = steps1 / 3.14 * (AA1_new - AAO1);  // absolute steps to new position shoulder
     new_position_d1 = steps2 / 3.14 * (AA3_new - AAO3);  // absolute steps to new position upper arm
     new_position_d2 = steps3 / 3.14 * (AA4_new - AAO4);  // absolute steps to new position lower arm
     new_position_l0 = (long) new_position_d0;
     new_position_l1 = (long) new_position_d1;
     new_position_l2 = (long) new_position_d2;
     new_position[0] = new_position_l0;
     new_position[1] = new_position_l1;
     new_position[2] = new_position_l2;
   // activate steppers to go to new_position[n] - current_position[n]
     steppers.moveTo(new_position);
     steppers.runSpeedToPosition(); // Blocks until all are in position  
   } else {
     Serial.println("Could not calculate position");
   }
}
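
The conversion in `move_to_carth()` is a linear scaling from radians to motor steps. The same arithmetic in a few lines of Python, using the shoulder calibration of 7000 steps per 180 degrees from the sketch (the rounding here is mine; the sketch truncates via a cast to long):

```python
import math

def angle_to_steps(angle_new_rad, angle_zero_rad, steps_per_180):
    """Absolute step target for one joint: scale the angle delta
    (in radians) by the steps-per-pi calibration constant."""
    return round(steps_per_180 / math.pi * (angle_new_rad - angle_zero_rad))

print(angle_to_steps(math.pi / 2, 0.0, 7000))  # quarter turn of the shoulder
```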

My aim was to develop a robot arm that would be able to draw a picture of some sort on a flat surface. Below you can find my code (combining the above functions) to draw a circle within a rectangle in a vertical plane:

// Include the AccelStepper library:
#include <AccelStepper.h>
#include <MultiStepper.h>

// Define stepper motor connections and motor interface type. Motor interface type must be set to 1 when using a driver:
#define dirPinLowerArm 2
#define stepPinLowerArm 3
#define dirPinUpperArm 4
#define stepPinUpperArm 5
#define dirPinBase 8
#define stepPinBase 9
#define motorInterfaceType 1

// coordinates of wrist relative to shoulder
double XX2;
double YY2;
double ZZ2;
double RR2;
double XX2_new;
double YY2_new;
double ZZ2_new;
double RR2_new;
double AA1_new;   // horizontal angle of R
double AA2_new;   // vertical angle of R
double AA3_new;   // upper arm angle relative to R
double AA4_new;   // lower arm angle relative to upper arm

double LL1 = 160; // upper arm length
double LL2 = 340; // lower arm length

// start angles at stepper 0 positions
double AAO1 = 0.0000;  // angle shoulder at start
double AAO3 = 0.0392;  // angle lower arm at start
double AAO4 = 0.6211;  // angle upper arm at start

// min max angles
double AAO3min = 0.0387;  // angle lower arm
double AAO4min = 0.5233;  // angle upper arm
double AAO3max = 3.15000;  // angle lower arm
double AAO4max = 3.15000;  // angle upper arm
double RR2_min = 115.0; // minimum possible radius
double RR2_max = LL1 + LL2; // maximum possible radius

int steps1 = 7000; // number of steps for 180 degrees rotation shoulder
int steps2 = 4600; // number of steps for 180 degrees rotation upper arm
int steps3 = 3600; // number of steps for 180 degrees rotation lower arm

double new_position_d0;
double new_position_d1;
double new_position_d2;
long new_position_l0;
long new_position_l1;
long new_position_l2;
long new_position[3];
double new_position_d[3];

// Create a new instance of the AccelStepper class:
AccelStepper stepperUpperArm = AccelStepper(motorInterfaceType, stepPinUpperArm, dirPinUpperArm);
AccelStepper stepperBase = AccelStepper(motorInterfaceType, stepPinBase, dirPinBase);
AccelStepper stepperLowerArm = AccelStepper(motorInterfaceType, stepPinLowerArm, dirPinLowerArm);
MultiStepper steppers;

void setup() {
   // put your setup code here, to run once:
   Serial.begin(9600);

  // Set the maximum speed and acceleration:
  stepperLowerArm.setMaxSpeed(400);
  stepperLowerArm.setAcceleration(25);
  stepperLowerArm.setPinsInverted(true);
  stepperUpperArm.setMaxSpeed(400);
  stepperUpperArm.setAcceleration(25);
  stepperBase.setMaxSpeed(400);
  stepperBase.setAcceleration(25);

  // add steppers to multistepper
  steppers.addStepper(stepperBase);
  steppers.addStepper(stepperUpperArm);
  steppers.addStepper(stepperLowerArm);
 
   // reset position of stepper motors
   new_position[0] = 0;
   new_position[1] = 0;
   new_position[2] = 0;
 
   // move to starting position
   XX2_new = 350;
   YY2_new = cos(0) * 50;
   ZZ2_new = 125 + sin(0) * 50;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);
   delay(2000);

    // draw a circle with R = 50 in the Y-Z plane, maintaining X
   for (int i=0;i<721;i++) {
     XX2_new = 350;
     YY2_new = cos(PI * i / 360) * 50;
     ZZ2_new = 125 + sin(PI * i / 360) * 50;
     move_to_carth(XX2_new, YY2_new, ZZ2_new);
   }
   XX2_new = 350;
   YY2_new = 50;
   ZZ2_new = 75;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);
   XX2_new = 350;
   YY2_new = -50;
   ZZ2_new = 75;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);
   XX2_new = 350;
   YY2_new = -50;
   ZZ2_new = 175;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);
   XX2_new = 350;
   YY2_new = 50;
   ZZ2_new = 175;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);
   XX2_new = 350;
   YY2_new = 50;
   ZZ2_new = 75;
   move_to_carth(XX2_new, YY2_new, ZZ2_new);

   delay(5000);

   // move to origin
   new_position[0] = 0;
   new_position[1] = 0;
   new_position[2] = 0;
   steppers.moveTo(new_position);
   steppers.runSpeedToPosition(); // Blocks until all are in position

}

void loop() {
   // put your main code here, to run repeatedly:
   // no code in loop for now
}

void move_to_carth(double X, double Y, double Z) {
   if (carth_to_spher(X, Y, Z)) {
     new_position_d0 = steps1 / 3.14 * (AA1_new - AAO1);  // absolute steps to new position shoulder
     new_position_d1 = steps2 / 3.14 * (AA3_new - AAO3);  // absolute steps to new position upper arm
     new_position_d2 = steps3 / 3.14 * (AA4_new - AAO4);  // absolute steps to new position lower arm
     new_position_l0 = (long) new_position_d0;
     new_position_l1 = (long) new_position_d1;
     new_position_l2 = (long) new_position_d2;
     new_position[0] = new_position_l0;
     new_position[1] = new_position_l1;
     new_position[2] = new_position_l2;
   // activate steppers to go to new_position[n] - current_position[n]
     steppers.moveTo(new_position);
     steppers.runSpeedToPosition(); // Blocks until all are in position  
   } else {
     Serial.println("Could not calculate position");
   }
}

boolean carth_to_spher(double X, double Y, double Z) {
   RR2_new = sqrt(X * X + Y * Y + Z * Z);
   AA1_new = atan(Y / X);
   AA2_new = asin(Z / RR2_new);
   // check if carth coordinate fits in spherical range of robot arm
   if (RR2_new > RR2_max) {
     Serial.println("Max R exceeded");
     return(false);
   } else if (RR2_new < RR2_min) {
     Serial.println("Min R exceeded");
     return(false);
   } else {
     AA3_new = PI - (AA2_new + acos((LL1 * LL1 + RR2_new * RR2_new - LL2 * LL2) / (2 * LL1 * RR2_new)));
     AA4_new = acos((LL2 * LL2 + LL1 * LL1 - RR2_new * RR2_new) / (2 * LL2 * LL1));
     if (AA3_new >= AAO3min && AA3_new <= AAO3max && AA4_new >= AAO4min && AA4_new <= AAO4max) {
       return(true);
     }
     else {
      Serial.println("Angles exceeded");
      return(false);
     }
   }
}

And the result?

Robot’s first drawing…

Not bad for a first try, right? …Well, it could be improved a bit perhaps… Thanks for reading anyway, and please do let me know what you think of my experiment or if you have any questions on this subject.