Posts

PWAs...hang on, did time stand still and then loop back on itself?

Back in the day, I purchased one of these phones (wow, was it really in 2013? I suppose it must have been): https://en.wikipedia.org/wiki/ZTE_Open The cool thing about it at the time was that it wasn't Android, Apple or Microsoft. It had its own OS and the potential to investigate "other options" for apps. As it was backed by Firefox / Mozilla, it made sense that it was driven by the Web and, more specifically, by web apps that run in web browsers. Hey, that suited me fine; it's what I've mostly been doing since, well, since... for far too long. Whilst I have no issue, when the need requires, coding a native Android Java app, or flipping my head the other way and coding in Swift for an iOS device, it did kind of irk me that I had to start following the code-religion camps again... it also conflicted that I wasn't focused on just coding for the device itself, but having to code for the server-side APIs and usually a web app that also offered the...

Self parking slippers from Nissan

Original article HERE. At first glance, the ProPILOT Park Ryokan looks like any other traditional Japanese inn, or ryokan. Slippers are neatly lined up at the foyer, where guests remove their shoes. Tatami rooms are furnished with low tables and floor cushions for sitting. What sets this ryokan apart is that the slippers, tables and cushions are rigged with a special version of Nissan's ProPILOT Park autonomous parking technology. When not in use, they automatically return to their designated spots at the push of a button. For its primary application, Nissan's ProPILOT Park system uses an array of four cameras and twelve sonar sensors to wedge its host vehicle into even the smallest of parking spaces — whether it's nose-in parking, rear-in parking, or, trickiest of all, parallel parking. It seems unlikely that the slippers use quite the same technology, although Nissan does suggest that the technology is at least similar, which would mean that the slippers are opera...

NAO robot and IBM Watson services

You know I like robots. You know I like IBM Watson API services. Check out how to get those two working together — there's a full description of everything needed in the GitHub repo README.md. CHECK OUT THE GITHUB REPO (now) (watch the YouTube replay of the live cast HERE). I'm going to take this as a baseline and, instead of using the NAO, I'm going to use my T1ll13 robot head. Oh, btw — I've had some really good 3D printing and building success — roll on the weekend, when I can get those web-cam eyes hooked up and working too!

Using machine learning to spur human creativity is so 2018

You may recall my Digital Music Tangent post from a little while back? It may have sounded a little odd and off-the-wall, but it's funny how things sometimes connect up in the universe. Welcome to the IBM Watson Beat. You can use this open-source software to create rich compositions from simple input melodies. That's right: if you can hum it (and record it), you can get Watson to take those elements and combine them into some non-man-made music. Check it out, run it on your own laptop and get creative: https://github.com/cognitive-catalyst/watson-beat Using machine learning to spur human creativity is so 2018! Watson used a combination of reinforcement learning and neural networks to create the Watson Beat. The first album, "Space", is actually very melodic and has replaced the usual "meditation music" I listen to when I start the working day. (Full album on YouTube here.) Full-length original article is HERE ...

3D Printing is making some 2018 progress

T1ll13 is starting to manifest into reality... (btw — those are nose supports that I need to hack off with my Dremel!) Now I'm just on a mission to print out all the internal parts that make up the mechanisms for the movements to happen... And yes, it's all in PLA, as that's what works for me and this printer. ABS is just a waste of time and effort at the moment: it rarely sticks to the bed (I've been through the rounds of increasing the bed temperature) and, when it does stick, after a few cm it starts to split... So I decided that, rather than messing around, I'll just do it all in PLA initially and then come back and print again in ABS — probably once I've got myself a glass bed in place (or a different 3D printer!). I think we might be 3D printing for a few more weeks yet... as there are 10-13 parts needed for the skull, and then... and then... and then... you get the idea :-)

Using ROS to control GPIO pins on a Raspberry Pi 3

As the title says, it was time for me to document how to do the "simple" task of using ROS (Robot Operating System) to turn an LED on/off (well, we can do a LOT more than that using the GPIO pins, but it's a starting baseline). Because I've installed ROS onto the Raspberry Pi itself, I can create ROS nodes directly on the RPi. So let's get on with it and create a simple demo to blink an LED using ROS topics from the RPi. First step: we have to download "wiringPi":

$ git clone git://git.drogon.net/wiringPi
$ cd wiringPi
$ sudo ./build

The interesting thing is that everyone decides for themselves how to refer to the GPIO pins on the RPi... I just keep track of the physical pin (as that doesn't change!). Here is the GPIO pin layout in relation to wiringPi: https://projects.drogon.net/raspberry-pi/wiringpi/pins/ Now that we've installed the library we're going to be using, let's switch to creating a ROS package...
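The LED-over-a-ROS-topic demo described above can be sketched as a minimal rospy subscriber node. This is only a sketch under my own assumptions, not the post's actual code: I've used the RPi.GPIO Python library (rather than wiringPi), and the topic name "led", physical pin 11 and the "on"/"off" command strings are illustrative choices of mine:

```python
#!/usr/bin/env python
# Hypothetical ROS node: turns an LED on/off from std_msgs/String
# messages published on a "led" topic. Pin 11 (physical / BOARD
# numbering, as the post prefers) and the topic name are assumptions.

def message_to_level(command):
    """Map an incoming command string to a GPIO level (True = LED on)."""
    return command.strip().lower() in ("on", "1", "high")

try:
    import rospy
    from std_msgs.msg import String
    import RPi.GPIO as GPIO

    LED_PIN = 11  # physical pin number, GPIO.BOARD mode

    def on_message(msg):
        # Drive the pin high or low depending on the command text.
        GPIO.output(LED_PIN, message_to_level(msg.data))

    def main():
        GPIO.setmode(GPIO.BOARD)          # use physical pin numbering
        GPIO.setup(LED_PIN, GPIO.OUT)
        rospy.init_node('led_blinker')
        rospy.Subscriber('led', String, on_message)
        rospy.spin()                      # block until the node is shut down

    if __name__ == '__main__':
        main()
except ImportError:
    # rospy and RPi.GPIO only exist on a ROS-equipped Pi; elsewhere the
    # command-mapping helper above can still be exercised on its own.
    pass
```

With the node running, blinking the LED from another terminal would just be publishing to the topic, e.g. `rostopic pub /led std_msgs/String "on"`.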

3D Printer arrives

After much hassle, delay and patience... the 3D printer arrived. Well, yes, it arrived on a Sunday afternoon. Whilst I was out. At Cheddar Gorge, because no-one is going to deliver anything on a Sunday afternoon, are they? It's safe to go out and get some cheese... Nope. I came back home and found this box on the doorstep. Yes, it was very wet and soaking up the water. I pushed it into the porch... I smiled and thought of positive things, which, as it turns out, was the right thing to do. There was a box inside the box, so the 3D printer was well protected. I did the classic "sweep everything off the dining table" manoeuvre and set about putting it together. I stuck the SD card in the side and selected a file that ended in .gcode — I had absolutely no idea what it was, but I just wanted to test that everything was okay. After running through the warm-up and levelling the print bed, it looked like it was time to go. Initially I didn't...

T1ll13 robot step 2.1

Whilst I await a very slow delivery of a 3D printer (yes, I decided to go ahead, buy one and give it a go), I decided to switch back to the software side of things. I need to do some work with the Xbox Kinect and ROS. After a bit of googling around, I see that, as I've decided to use the ROS "Kinetic" version and not the "Indigo" version that everyone else has previously used (that's the older version, btw), I'd be figuring this out for myself — and, who knows, I might even help some other people out along the way. I got a bit distracted, and it looked like I needed to set up the robot simulator software. Apparently I need to install MoveIt! — so, time to fire up the RasPi3, drop to a Terminal and type:

$ sudo apt-get install ros-kinetic-moveit
$ source /opt/ros/kinetic/setup.bash

(and then, !boom!, 2 hrs of power cuts hit my area, probably something to do with the snow, etc...) http://docs.ros.org/kinetic/api/mov...
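Once MoveIt! is installed, a first sanity-check script tends to look something like the sketch below, driving a planning group through moveit_commander. To be clear, this is my own illustrative sketch, not anything from the post: the preferred group name "arm" and the named target "home" are assumptions (real names come from the robot's SRDF), and it presupposes a running ROS master with a MoveIt!-configured robot or simulator:

```python
# Hypothetical first MoveIt! script. Group and target names are
# illustrative; on a real setup they come from the robot's SRDF.

def pick_group(available_groups, preferred="arm"):
    """Choose a planning group: the preferred name if present,
    otherwise the first group available, otherwise None."""
    if preferred in available_groups:
        return preferred
    return available_groups[0] if available_groups else None

try:
    import sys
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    robot = moveit_commander.RobotCommander()

    # Ask the robot description which planning groups exist.
    group_name = pick_group(robot.get_group_names())
    group = moveit_commander.MoveGroupCommander(group_name)

    group.set_named_target("home")  # a named pose defined in the SRDF
    group.go(wait=True)             # plan and execute the motion
except ImportError:
    # moveit_commander only exists where ROS + MoveIt! are installed;
    # the group-selection helper above still works stand-alone.
    pass
```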