3D printed robot head - object tracking

Object Tracking...in Python and C! Wow, what a bonus...

After a little break of playing with some old PC equipment, I got back to the important task at hand.  I did digress, but not for long, which is a bit of a record for me.  Y'know, I think I just might be able to stick with this for 2021 now and "get some stuff done"...

I've even managed to persuade Mrs.IsAGeek to lend a hand...she's just bought herself a robot car platform that'll use one of the Raspberry Pi 3 devices, so she can learn about the hardware and then the software side of things....she'll soon overtake me on the C-coding front (she's a secret guru at that, and although she's probably a little rusty, it won't be long before it comes back to her).  Then we'll migrate up to more and more sensors, add a few Arduinos to the mix, a few more servos, a robot arm or two...and then some legs...and then I'm sure we'll be back to the question of whether to use ROS or not.

I'm hoping that we'll keep each other enthused enough to make time to "get some stuff done"....

Right, back to Object Tracking.

I found an article or two that showed the 7-8 different methods of tracking an object once it has been defined.  There are different pros & cons to the different solutions; however, on my Honor laptop and the Raspberry Pi, I could only ever get 1 or 2 of them to work (using the extra opencv-contrib-python library), so maybe I've made the same mistake twice - we'll see when I port it over to C.
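Just for reference, here's roughly how those tracker types get created in Python - a minimal sketch, and the names are a moving target between OpenCV versions (in a lot of the 4.5+ opencv-contrib-python builds most of the older ones live under cv2.legacy), which might well be why only a couple of them worked for me:

    import cv2

    # Roughly the set of built-in trackers the articles list
    # (assuming an OpenCV 4.x opencv-contrib-python build).
    TRACKERS = {
        "csrt": cv2.TrackerCSRT_create,
        "kcf":  cv2.TrackerKCF_create,
        "mil":  cv2.TrackerMIL_create,
        # These need the cv2.legacy prefix on newer builds:
        # "mosse":      cv2.legacy.TrackerMOSSE_create,
        # "medianflow": cv2.legacy.TrackerMedianFlow_create,
        # "boosting":   cv2.legacy.TrackerBoosting_create,
        # "tld":        cv2.legacy.TrackerTLD_create,
    }

    tracker = TRACKERS["kcf"]()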



So...the good news is that, in code, it all works from the same basis: you access a webcam, read a frame, then let the user left-click and draw a "box" around the object of interest - namely my ugly mug - which is shown in blue.  Press "Enter" and the box changes to red/pink-ish and starts tracking the object as it moves.  Now, this doesn't care whether it's my face or a cup of tea; it's just interested in recognising that object and keeping track of it.
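In Python that whole flow is only a handful of lines.  Here's a minimal sketch of what I mean (the window name and box colour are just my choices - cv2.selectROI handles the click-and-drag box and returns it once you press Enter):

    import cv2

    cap = cv2.VideoCapture(0)          # first webcam
    tracker = cv2.TrackerKCF_create()  # any of the trackers above will do

    ok, frame = cap.read()
    # Left-click and drag a box around the object, then press Enter/Space
    bbox = cv2.selectROI("tracking", frame, showCrosshair=False)
    tracker.init(frame, bbox)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, bbox = tracker.update(frame)
        if found:
            x, y, w, h = [int(v) for v in bbox]
            # red-ish box once we're tracking the object
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()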




As a little tester, the frames-per-second value is displayed on the image.  Running it on my 8-core Honor laptop I was getting about 10 frames a second (not sure quite why it was so low), but running it on the 4-core Raspberry Pi 4 I was getting 5-10 fps - and, this is the main thing, the CPU average was around the 30% mark.  Now, this is the key point here.  This is what I was aiming to achieve.  I wanted to identify a face, then just track that face, and then use that tracking to move the servos.
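The FPS overlay is nothing clever, by the way - just OpenCV's tick counter wrapped around each frame, something like this sketch:

    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        timer = cv2.getTickCount()
        ok, frame = cap.read()
        if not ok:
            break
        # ...tracker.update(frame) would go here...
        fps = cv2.getTickFrequency() / (cv2.getTickCount() - timer)
        cv2.putText(frame, "FPS: %.0f" % fps, (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("fps", frame)
        if cv2.waitKey(1) & 0xFF == 27:
            break
    cap.release()
    cv2.destroyAllWindows()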

The next step for this code, apart from porting it over to C, is to use the previous face recognition code: once I've captured a face, that's the equivalent of the manual left-click-and-box selection, and then we can do the tracking.  If the face is "lost", we re-do the face recognition/finding piece until we find one again, and then we go back to object tracking.  I fully expect there to be spikes of CPU usage during those moments, but I reckon that will give us exactly what I wanted to achieve.
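To make that plan a bit more concrete, here's a minimal Python sketch of the detect-then-track loop I have in mind (the cascade path is an assumption - adjust it for wherever your OpenCV install keeps the Haar XML files - and the Haar hit seeds the tracker exactly like the manual box did):

    import cv2

    # Cascade path is an assumption - adjust for your install
    CASCADE = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)
    tracker = None  # no face yet, so we start in "detect" mode

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        if tracker is None:
            # Detect mode: look for a face, seed the tracker with the first hit
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = CASCADE.detectMultiScale(gray, 1.3, 5)
            if len(faces) > 0:
                x, y, w, h = faces[0]
                tracker = cv2.TrackerKCF_create()
                tracker.init(frame, (int(x), int(y), int(w), int(h)))
        else:
            # Track mode: cheap per-frame update, no cascade running
            found, bbox = tracker.update(frame)
            if found:
                x, y, w, h = [int(v) for v in bbox]
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
            else:
                # Lost the face: say so, then drop back to detect mode
                cv2.putText(frame, "Lost - re-detecting", (20, 40),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 0, 255), 2)
                tracker = None

        cv2.imshow("face-track", frame)
        if cv2.waitKey(1) & 0xFF == 27:
            break

    cap.release()
    cv2.destroyAllWindows()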

Here's a video of the basic setup showing it working (Python code version):


...and true to my word, I ported it over to C code to achieve the same result.

Well, I say the same - I actually thought I'd do a little bit more with this one.  I decided I would do Haar face detection first; once a face is detected, I use the detection values as the Tracker co-ordinates and then track the object that way - the object being my face.

Just for debugging, I thought I'd leave the first detection frame on the screen so I could see that the face was detected, and then I create a second window where the tracking is done.
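In Python terms (the window names are just mine), that debug trick is simply two imshow windows - one frozen, one live:

    import cv2

    cap = cv2.VideoCapture(0)
    ok, first = cap.read()
    # (the Haar detection and box drawing would happen on 'first' here)
    cv2.imshow("detection", first)   # frozen frame showing the detected face

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # (tracker.update(frame) and box drawing would happen here)
        cv2.imshow("tracking", frame)  # live window the tracking runs in
        if cv2.waitKey(1) & 0xFF == 27:
            break

    cap.release()
    cv2.destroyAllWindows()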



I was playing around, trying to lose the tracker and therefore the detected face; if this happens, I put up some words to state it and then call the same code again to attempt to detect a face.

For now, if a face is NOT detected straight away, I just cancel out of the code.  I've put in a comment that I need to put that through a loop and keep going until I do find a match...hey, a comment is a good start...it means I might actually get around to putting that in place, at some point.
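For the record, that loop is only a few lines - sketched in Python here, since the idea carries straight over to the C version:

    import cv2

    CASCADE = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)

    # Keep grabbing frames until the cascade finds a face,
    # rather than bailing out after the first miss.
    faces = ()
    while len(faces) == 0:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = CASCADE.detectMultiScale(gray, 1.3, 5)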

Anywayz.....here's the hacky C code, an evolution of the code from before, along with the new Tracker code:








and here is the obligatory video to go with it.  I almost forgot to add the nmon output, so here we go, I'll do it again - I'll also add in a little bit of failure too, so it shows the re-scan for a face:


As you can see...the CPU usage is still below the 30% mark on the Raspberry Pi 4, even with the extra Haar face detection happening in the code.  This is better.  This is much, much better.

Right, tomorrow evening...we'll tap it into the servo code!  Now, where did I put that <wiringPi.h> file?

