Thursday, 9 August 2018

Little Critter v1.0

So, today after wrapping up the work-work, I decided that I was waaaaaay overdue digging out the Arduinos and Raspberry Pis that I've got stuffed into various drawers.

To be honest, I'd forgotten half the stuff I've bought over the years.  Turns out, I've got quite a few "spare parts" that can actually be put together to make a "whole".

So I thought, hey, why not - it's been a while since I built the ArcTuRus (higher consciousness / sub-consciousness - dual-brained) robot, which is still sitting on the shelf (over there -----> ).

This time though, I thought I'd just make a start at something simple, i.e. get some navigation or movement sorted out via some motorised wheels driven by a motor-controller board.  Then I'd use a USB mic to listen for commands and a USB speaker for shouting abuse at me, and shoe-horn in a camera so the little critter can "see" the world around it - and who knows, probably use OpenCV or something to start detecting objects and reacting accordingly.
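Stripped of the hardware, that plan is really just a command-to-action loop: hear a command, drive the wheels accordingly.  Here's a minimal Python sketch of that idea with all the hardware stubbed out - the command names and the action strings are my own placeholders, not a real motor-controller API:

```python
# Toy sketch of the critter's command loop. In the real build, the strings
# below would become calls into a motor-controller library, and the commands
# would come from speech recognition on the USB mic rather than a list.

def handle_command(command):
    """Map a recognised voice command to a (stubbed) motor action."""
    actions = {
        "forward": "both wheels forward",
        "back": "both wheels reverse",
        "left": "right wheel forward, left wheel stopped",
        "right": "left wheel forward, right wheel stopped",
        "stop": "both wheels stopped",
    }
    # Unknown commands get the abuse-via-USB-speaker treatment.
    return actions.get(command, "unknown command - shout abuse at the human")

if __name__ == "__main__":
    for cmd in ["forward", "left", "dance"]:
        print(f"{cmd!r} -> {handle_command(cmd)}")
```

Obviously the fun part is swapping those stubs for real GPIO / motor-driver calls, but having the dispatch skeleton first means the speech and vision bits can be bolted on independently later.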

Naturally, at this point I start drifting off with grand plans of hooking up to REST APIs, machine learning services, mapping services to map out the rooms for navigation, etc., etc.  And that's usually where I then tend not to actually get around to building the base model - or I get all the bits together, something else comes up, and I don't get around to the grander vision or even the basic one.

This time though, I'm going to do it.  Yeah, I'm going to spend the next 2 weeks in London, staying in a hotel every night, but the good thing is: I can take all the bits I need with me!!!  Sorted.

Expect me to have something built from a skeleton hardware perspective, with the software cobbled together on top.  When I'm happy, I'll decide whether to port it all into the K9 or R2-D2 casing that I have sitting on the other shelf (over there <--------).

I'll keep you updated.  Let's see where this goes.

Who knows, you might see me on Kickstarter in 2019  :-D

Talking of which, this just gave me a little bit of inspiration:

Vector is powered by a 1.2 GHz quad-core Qualcomm Snapdragon, has a wide-angle HD camera, single-point laser for mapping and navigation, four-mic array, and capacitive touch sensors. And while its speech capabilities rely on the cloud, other functions, like detecting if there are people nearby, use a convolutional neural network running on its on-board processor.

Check out this article to find out more.
