Found a robot arm in the garage (as you do!)

As you do on a Saturday, I found an old Maplin robot arm (who remembers Maplin, before they stopped trading?) with a USB connector.


I was looking around online to purchase a new one and, of course, I got carried away - moving from £50, to £250, to £400 devices; I even glanced at the £7,000 versions. But alas, I am being money-conscious nowadays and thought: well, if I have this old one, SURELY I can make it work.

I had a quick google and indeed the robot arm was from Maplin.  I unscrewed the casing and found the 4 x D batteries inside, with an "expiry date of Mar 2019".  That tells you more than enough.  The unit had gotten quite dusty knocking around in the garage.  I "probably" did something with it back in the 2016-2018 timeframe, I cannot recall. As I've mentioned, I kind of got distracted by the Corporate world around then and lost my focus - fret not, it is returning!

I noticed that it has the USB connector and adapter board on the back - again, more googling, and I see that it originally came with a CD-ROM and the software was for Windows ONLY.

Sigh.

I was going to leave it as an objet d'art on the bookcase and go and spend that hard-earned cash, when I thought, "No, I will find something on the good old inter-webs".

It was NOT easy - but indeed, I did find something that led me to something else, which led me to the GOLD.



There was the GOLD - a link to this website:

https://notbrainsurgery.livejournal.com/38622.html



That post had some C code showing the usage of the libusb C library.


Turns out the libusb link in that post is dead, BUT the library still exists - it's just moved here:

https://libusb.info/



I've ordered four new D batteries - I actually couldn't get a better price than Amazon delivery (I know, I know), but hey, it gives me time to get the software working before they arrive tomorrow :-D

I would prefer to get the C code running on a Raspberry Pi, which should be simple enough, and then I can hook that up to take commands remotely OR even just hook it up with "other things" to make it a bit more autonomous.  Oooooo, I just remembered I have a couple of Raspberry Pi Camera modules knocking around that I found recently too.
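To make the "commands from remote" bit concrete, here's a minimal sketch: a tiny HTTP endpoint on the Pi that just shells out to the compiled armedgetest binary (which appears further down this post). The port and URL scheme are made up for illustration, and there's no authentication - a sketch, not a finished thing:

# Rough sketch: accept e.g. GET /00/00/01 and run ./armedgetest 00 00 01
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

def is_hex_byte(s):
    # each URL segment must be a single hex byte, e.g. "00".."ff"
    try:
        return 0 <= int(s, 16) <= 0xFF
    except ValueError:
        return False

class ArmHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parts = self.path.strip("/").split("/")
        if len(parts) != 3 or not all(is_hex_byte(p) for p in parts):
            self.send_error(400, "expected /CMD0/CMD1/CMD2 as hex bytes")
            return
        # hand the three bytes straight to the C test program
        subprocess.run(["./armedgetest", *parts], check=False)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"sent\n")

# Hypothetical port choice; anything free on the Pi will do.
HTTPServer(("0.0.0.0", 8080), ArmHandler).serve_forever()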

Okay, this could end up being a bit Frankenstein's monster, but it'll be a fun use of time and re-invigorate that creative urge that the Corporate World (and social media, etc...) has drained out of me.

Which also reminds me: there's an interesting video worth watching - if you can, give it a watch. It's basically how I've felt for the last 3-4 years; I acted upon it about two years ago and I am now starting to feel like myself again (yes, I deleted the Facebook / Instagram apps from my phone and I feel so much better talking to myself in my own head! I know that sounds wrong, but it feels so right - give it a try).


Will update with progress.

UPDATE:

Here's the C code as a starter:

#include <stdlib.h>
#include <stdio.h>
#include <sys/types.h>
#include <string.h>
#include <libusb-1.0/libusb.h>

#define EP_INTR      (1 | LIBUSB_ENDPOINT_IN)  /* interrupt-in endpoint (unused in this test) */
#define ARM_VENDOR   0x1267                    /* USB vendor ID of the arm */
#define ARM_PRODUCT  0                         /* USB product ID */
#define CMD_DATALEN  3                         /* each command is three bytes */

/* Walk the device list and return the first device matching the arm's IDs. */
libusb_device *find_arm(libusb_device **devs)
{
    libusb_device *dev;
    int i = 0;

    while ((dev = devs[i++]) != NULL) {
        struct libusb_device_descriptor desc;
        int r = libusb_get_device_descriptor(dev, &desc);
        if (r < 0) {
            fprintf(stderr, "failed to get device descriptor");
            return NULL;
        }
        if (desc.idVendor == ARM_VENDOR && desc.idProduct == ARM_PRODUCT) {
            return dev;
        }
    }
    return NULL;
}

int main(int ac, char **av)
{
    if (ac != 4) {
        fprintf(stderr, "Usage: armedgetest CMD0 CMD1 CMD2\n");
        return 1;
    }

    /* The three command bytes are given on the command line in hex. */
    unsigned char cmd[3];
    cmd[0] = (unsigned char)strtol(av[1], NULL, 16);
    cmd[1] = (unsigned char)strtol(av[2], NULL, 16);
    cmd[2] = (unsigned char)strtol(av[3], NULL, 16);

    libusb_device **devs;
    libusb_device *dev;
    struct libusb_device_handle *devh = NULL;
    int r;
    ssize_t cnt;

    r = libusb_init(NULL);
    if (r < 0) {
        fprintf(stderr, "failed to initialize libusb\n");
        return r;
    }
    libusb_set_debug(NULL, 2);

    cnt = libusb_get_device_list(NULL, &devs);
    if (cnt < 0)
        return (int)cnt;

    dev = find_arm(devs);
    if (!dev) {
        fprintf(stderr, "Robot Arm not found\n");
        return -1;
    }

    r = libusb_open(dev, &devh);
    if (r != 0) {
        fprintf(stderr, "Error opening device\n");
        libusb_free_device_list(devs, 1);
        libusb_exit(NULL);
        return -1;
    }

    fprintf(stderr, "Sending %02X %02X %02X\n",
            (int)cmd[0], (int)cmd[1], (int)cmd[2]);

    /* Vendor control transfer carrying the three command bytes.
       libusb_control_transfer() returns the number of bytes actually
       transferred (or a negative error code), so check against CMD_DATALEN. */
    r = libusb_control_transfer(devh,
                                0x40,        /* bmRequestType: vendor, host-to-device */
                                6,           /* bRequest */
                                0x100,       /* wValue */
                                0,           /* wIndex */
                                cmd,
                                CMD_DATALEN,
                                0);          /* timeout: 0 = unlimited */
    if (r != CMD_DATALEN) {
        fprintf(stderr, "Write err %d\n", r);
    }

    libusb_close(devh);
    libusb_free_device_list(devs, 1);
    libusb_exit(NULL);
    fprintf(stderr, "Done\n");
    return 0;
}


When compiling this, it throws this error:

gcc armedgetest.c -o armedgetest
/usr/bin/ld: /tmp/ccRoDrnJ.o: in function `find_arm':
armedgetest.c:(.text+0x37): undefined reference to `libusb_get_device_descriptor'
/usr/bin/ld: /tmp/ccRoDrnJ.o: in function `main':
armedgetest.c:(.text+0x18b): undefined reference to `libusb_init'
/usr/bin/ld: armedgetest.c:(.text+0x1cb): undefined reference to `libusb_set_debug'
/usr/bin/ld: armedgetest.c:(.text+0x1dc): undefined reference to `libusb_get_device_list'
/usr/bin/ld: armedgetest.c:(.text+0x244): undefined reference to `libusb_open'
/usr/bin/ld: armedgetest.c:(.text+0x27e): undefined reference to `libusb_free_device_list'
/usr/bin/ld: armedgetest.c:(.text+0x288): undefined reference to `libusb_exit'
/usr/bin/ld: armedgetest.c:(.text+0x2f8): undefined reference to `libusb_control_transfer'
/usr/bin/ld: armedgetest.c:(.text+0x338): undefined reference to `libusb_close'
/usr/bin/ld: armedgetest.c:(.text+0x349): undefined reference to `libusb_free_device_list'
/usr/bin/ld: armedgetest.c:(.text+0x353): undefined reference to `libusb_exit'
collect2: error: ld returned 1 exit status


It seems that Ubuntu already ships the library; however, you have to make sure you link against it when compiling:

(networkx) tony@tony-magicbook:~/dev/robot/arm$ gcc armedgetest.c -o armedgetest -lusb-1.0


and there we go, it compiled:

(networkx) tony@tony-magicbook:~/dev/robot/arm$ ls -l
total 24
-rwxrwxr-x 1 tony tony 16640 Jan 31 23:30 armedgetest
-rw-rw-r-- 1 tony tony  2468 Jan 31 23:17 armedgetest.c
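Side note for anyone following along: if gcc instead complains that libusb.h can't be found, the development headers probably aren't installed - on Ubuntu / Raspberry Pi OS that's normally sudo apt install libusb-1.0-0-dev, and pkg-config --cflags --libs libusb-1.0 will print the right compiler/linker flags if you'd rather not hard-code the -lusb-1.0 bit.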


It can now be executed:

(networkx) tony@tony-magicbook:~/dev/robot/arm$ ./armedgetest 00 00 01
Robot Arm not found
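For when it is powered up: if I'm reading the linked post right (this Maplin arm appears to be a rebadged OWI-535 "Robotic Arm Edge" with the USB interface), the three command bytes roughly map to byte 0 = the gripper/wrist/elbow/shoulder motors (two bits per motor), byte 1 = base rotation, and byte 2 = the gripper LED. So, assuming that's correct,

./armedgetest 00 00 01   (LED on)
./armedgetest 00 00 00   (all stop / everything off)

should make for a safe first smoke test.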


However, I am still waiting for those batteries to arrive before I can test this for real.


Update to follow.


To answer the question - "what can you do with a robot arm?" - here are a few ideas:





and this is ALSO why I've purchased a JETSON ORIN NANO, along with the stereo cameras.

Now, combine that tech together and you basically get "more" than the £400 robot arm for a lot less, while also recycling the existing kit and code I had knocking around already.


Update to follow.

Well, that was a bust! I got the batteries, fitted them, flipped the off/on switch to on... and nothing.
Hmmmm... out with the voltage tester. Yep, it has 6 volts (1.5+1.5+1.5+1.5) and the switch is okay, but it looks like the USB board is dead, or one of the four custom ICs on the back of the board is dead.
Ah well, it was worth spending the £10 on batteries just to find out it doesn't work.

However, as the motors are only two wires each plugged into the USB board (so they're plain DC gear motors rather than true servos), I'm pretty sure I can drive them from a Raspberry Pi - not straight off the GPIO pins, since the Pi has no analogue outputs and the pins can't supply motor current, but through a small motor driver board.  I guess the USB board would "protect" the motors so you cannot drive a joint too far either way, so I'll have to keep that in mind.  If I fry them, then it's time for the bin.  So, maybe the arm gets to live for a bit longer?
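Just to sketch what driving one of those motors from the Pi could look like - none of this wiring exists yet, and it assumes one arm motor is hooked to an L298N / DRV8833-style H-bridge with its two inputs on (hypothetical) GPIO 17 and 18:

# Rough sketch only: one arm motor behind an H-bridge driver on GPIO 17/18.
# There is no end-stop protection here - the USB board presumably did that -
# so keep the movements to short timed bursts.
from time import sleep
from gpiozero import Motor

shoulder = Motor(forward=17, backward=18)  # hypothetical pin choice

shoulder.forward(speed=0.5)   # run at half speed one way
sleep(0.3)                    # short burst so the joint can't over-travel
shoulder.stop()

shoulder.backward(speed=0.5)  # and back the other way
sleep(0.3)
shoulder.stop()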

We'll see.


UPDATE:

Well, as you know, I cannot stay bored for long, so whilst I dropped the robot arm, I fired up the RPi4 Freenove 4WD car robot thingy.  Basically, it has a bunch of sensors on it along with some wheels, so I stacked up some things underneath it so the wheels are not touching the table (yep, it's like I've done this before!?!).
Then I stumbled over this kid's great enhancement for using this device: https://github.com/Kiamehr5/RaspberryPi-AutoCar
Kudos to him - he's an 11-year-old who took the base code and tweaked it for his own needs.  Awesome; shout out to him.

I had to do a few things to get it to work.
sudo apt install python-opencv
(yes, I could have done it with the pip approach, but hey-ho)
then
pip install ultralytics
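(One caveat if you're on a newer Raspberry Pi OS image: the apt package there is python3-opencv rather than python-opencv.)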

I got an odd error referring to the picamera2 library. It turns out that one of the installs above pulled in numpy 2, and picamera2 needed an older 1.x version. Solving that was as simple as:
pip install numpy==1.26.4
which uninstalled numpy 2 and installed this version.
Then a quick test by running python camera.py - yay! That is what's shown inside the image below:


The important part is that, as I'm mis-using the car - basically just testing the very old camera module on the RPi4 - you can see the car is actually quite some distance away.
However, the Ultralytics YOLO image analysis that was performed (output on the left inside the image above) proved VERY GOOD!
I did pick up the bottle and hide it, which is why it is not detected.
It was also interesting that the number of "persons" changed - this was due to imagery on the TV that involved a person, and I wonder if it was accurate enough to spot the Buddha on the fireplace as a person.
I'll create a preview output with bounding boxes to show me what it is "seeing" - there's a rough sketch of that below.
However, seeing as I literally did NOTHING, I'm pretty impressed that this worked as well as it did.
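Here's a rough sketch of the kind of preview I have in mind, leaning on picamera2 plus the Ultralytics results[0].plot() helper to draw the boxes - the yolov8n.pt model name is just a small default for illustration, not necessarily what the AutoCar repo uses:

# Rough sketch: live preview with YOLO bounding boxes drawn on the Pi camera
# feed. Assumes picamera2, opencv and ultralytics are installed as per the
# steps above; the weights file is just a small default model.
import cv2
from picamera2 import Picamera2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")

picam2 = Picamera2()
picam2.configure(picam2.create_preview_configuration(
    main={"format": "RGB888", "size": (640, 480)}))
picam2.start()

try:
    while True:
        frame = picam2.capture_array()          # grab a frame from the camera
        results = model(frame, verbose=False)   # run detection on it
        annotated = results[0].plot()           # draw boxes + labels
        cv2.imshow("what the car sees", annotated)
        if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
            break
finally:
    picam2.stop()
    cv2.destroyAllWindows()

Using results[0].plot() means I don't have to write any drawing code myself, which suits the "literally did NOTHING" approach so far.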

It would be interesting to see how much faster and more accurately this would run on an RPi5 with a newer version of YOLO.  This is pretty outstanding, tbh - considering where this technology was when I first got the car (with the RPi4 and camera), the software has come along in leaps and bounds.

