Posts

IBM's Watson-based voice assistant is coming to cars and smart homes

https://www.engadget.com/2018/03/20/ibm-watson-assistant/

This all sounds very familiar :-D http://tonyisageek.blogspot.co.uk/2016/12/ibm-bmw-and-iot.html

One of IBM's first partners, Harman, will demonstrate Watson Assistant at the event through a digital cockpit aboard a Maserati GranCabrio, though the companies didn't elaborate on what it can do. In fact, IBM already released a Watson-powered voice assistant for cybersecurity early last year. You'll be able to access Watson Assistant via text or voice, depending on the device and how IBM's partner decides to incorporate it. So you'll definitely be using voice if it's a smart speaker, but you might be able to text commands to a home device. Speaking of commands, it wasn't just designed to follow them -- it was designed to learn from your actions and remember your preferences. If you allow the Watson-powered ap...

Automated harvesting by agricultural robots

This is something close to my own heart and something I'd like to get more involved with in the future. https://spectrum.ieee.org/tech-talk/robotics/artificial-intelligence/ai-detects-papaya-ripeness

I like the idea that the robot harvesters can use visual object detection and then recognition to examine the fruit and determine if it needs picking. How it does that picking raises an eyebrow, but mixed with a robotic arm, pressure-sensitive 'finger' tips and some good coding, it'd be feasible.

Then, in the grocery shop as a customer, you can whip out your phone, scan the fruit on the shelf, and it can "highlight" which fruit will ripen when, allowing you to get the best choice of fruit to meet your needs. i.e. do I want to eat it today? Actually, I want to eat 3 of these in 3 days' time - which are the best 3 to select? Sounds all very cyberpunk...
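That last step is really just a little ranking problem: assuming the ripeness-detection model can give each scanned fruit a predicted days-to-ripe score, picking "the best 3 to eat in 3 days" is a sort by distance from the target day. A hypothetical sketch in JavaScript - the fruit objects and the `daysToRipe` field are my own invention, not something from the article:

```javascript
// Hypothetical sketch: given fruits with a predicted days-to-ripe score
// (which the vision model would supply), pick the `count` fruits whose
// ripening day best matches when you plan to eat them.
function pickBest(fruits, targetDays, count) {
  return fruits
    .slice() // don't mutate the caller's array
    .sort((a, b) =>
      Math.abs(a.daysToRipe - targetDays) - Math.abs(b.daysToRipe - targetDays)
    )
    .slice(0, count);
}

// "I want to eat 3 of these in 3 days' time - which are the best 3?"
const shelf = [
  { id: 'papaya-1', daysToRipe: 0 },
  { id: 'papaya-2', daysToRipe: 3 },
  { id: 'papaya-3', daysToRipe: 6 },
  { id: 'papaya-4', daysToRipe: 2 },
  { id: 'papaya-5', daysToRipe: 4 },
];
console.log(pickBest(shelf, 3, 3).map(f => f.id));
// papaya-2 (spot on), then papaya-4 and papaya-5 (1 day off each)
```

Obviously the hard part is the vision model producing that score in the first place; the shopping side of it is the easy bit.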

Program in C

Frank, I doff my cap to you, Sir - awesome find and very relevant!

Elon, you send a car....IBM will send a disembodied head...

Meet CIMON: http://fortune.com/2018/02/28/ibm-cimon-robot/ Called CIMON (Crew Interactive Mobile Companion), the new crew member is about the size of a medicine ball and will work alongside human astronauts in space. The "floating brain" is equipped with IBM's Watson artificial intelligence technology and is expected to assist astronauts during the European Space Agency's Horizons mission in June.

T1ll13 robot step 2.2

Very minor update, but great fun along the way...

As previously shared, I set up an RPi3 with ROS and was about to move into the coding of ROS nodes. Whilst that was okay, it involved me having to use Python, and whilst I can see it's really useful, it's not my native tongue. That is JavaScript - I've been writing it client- and server-side since about 1998, so it's my go-to comfort first choice. So rather than attempting to port everything in my head into Python and then into ROS nodes, I thought I would do what I needed to do "to get it to work" and then look at porting it over.

What did that require? Well, installing NodeJS and npm on the RPi3 for one. Dead easy and simple to do.

Oh, I forgot to say what it is/was I wanted to actually achieve! Okay, I want to be able to SHOUT at T1ll13 and for her to always be listening via a microphone (no trigger "Alexa" or "Google" words for me), to convert that Speech to Text, then s...
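Since the always-listening, shout-triggered part is what I wanted in JavaScript, here's a minimal Node sketch of just that gate stage. All the naming here is my own illustration - in the real thing, PCM chunks would arrive from a microphone library and the stub would call an actual Speech-to-Text service:

```javascript
// Root-mean-square loudness of one chunk of audio samples in [-1, 1].
function rms(samples) {
  const sumSq = samples.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sumSq / samples.length);
}

// T1ll13 reacts only when you SHOUT: gate chunks on an RMS threshold,
// so there's no "Alexa"/"Google" wake word involved.
function isShout(samples, threshold = 0.5) {
  return rms(samples) >= threshold;
}

// Placeholder for a real Speech-to-Text call (e.g. a cloud STT service).
async function transcribe(samples) {
  return '<text from STT service>';
}

// Called for every chunk the microphone delivers: quiet speech is
// ignored; shouted speech gets sent off for transcription.
async function onAudioChunk(samples) {
  if (!isShout(samples)) return null;
  return transcribe(samples);
}
```

The threshold value is a guess - in practice you'd tune it against the actual microphone and room, and probably require several consecutive loud chunks so a door slam doesn't wake her up.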

PWAs...hang on, did time stand still and then loop back on itself?

Back in the day, I purchased one of these phones. (Wow! Was it really in 2013? I suppose it must have been.) https://en.wikipedia.org/wiki/ZTE_Open

The cool thing about it at the time was that it wasn't Android, Apple or Microsoft. It had its own OS and the potential to investigate "other options" for apps. As it was backed by Firefox / Mozilla, it made sense that it was driven by the Web and, more specifically, web apps that run in web browsers. Hey, that suited me fine - it's what I've been mostly doing since, well, for far too long.

Whilst I have no issue, when the need requires, coding a native Android Java app, or flipping my head the other way and coding in Swift for an iOS device, it did kind of irk me that I had to start following the code religion camps again... It also conflicted that I wasn't focused on just coding for the device itself, but having to code for the server-side APIs and usually a web app that also offered the...

Self parking slippers from Nissan

Original article HERE.

At first glance, the ProPILOT Park Ryokan looks like any other traditional Japanese inn, or ryokan. Slippers are neatly lined up at the foyer, where guests remove their shoes. Tatami rooms are furnished with low tables and floor cushions for sitting. What sets this ryokan apart is that the slippers, tables and cushions are rigged with a special version of Nissan's ProPILOT Park autonomous parking technology. When not in use, they automatically return to their designated spots at the push of a button.

For its primary application, Nissan's ProPILOT Park system uses an array of four cameras and twelve sonar sensors to wedge its host vehicle into even the smallest of parking spaces - whether it's nose-in parking, butt-in parking or, trickiest of all, parallel parking. It seems unlikely that the slippers use quite the same technology, although Nissan does suggest that the technology is at least similar, which would mean that the slippers are opera...

NAO robot and IBM Watson services

You know I like robots. You know I like IBM Watson API services. Check out how to get those two working together - there's a full description of everything needed in the GitHub repo readme.md.

CHECK OUT THE GITHUB REPO (now) (watch the YouTube replay of the live cast HERE)

I'm going to take this as a baseline and, instead of using the NAO, I'm going to use my T1ll13 robot-head.

Oh, btw - I've made some really good 3D printing and building progress - roll on the weekend when I can get those web-cam eyes hooked up and working too!

Using machine learning to spur human creativity is so 2018

You may recall my Digital Music Tangent post from a little while back? It may have sounded a little odd and off-the-wall, but it's funny how things sometimes connect up in the universe.

Welcome to the IBM Watson Beat. You can use this open-source software to create rich compositions from simple input melodies. That's right: if you can hum it (and record it), you can get Watson to take these elements and combine them to make some non-man-made music. Check it out and run it on your own laptop and get creative: https://github.com/cognitive-catalyst/watson-beat

Using machine learning to spur human creativity is so 2018! IBM used a combination of reinforcement learning and neural networks to create Watson Beat. The first album, "Space", is actually very melodic and has replaced the "meditation music" I usually listen to when I start the working day. (Full album on YouTube here)

Full-length original article is HERE...

3D Printing is making some 2018 progress

T1ll13 is starting to manifest into reality... (btw - those are nose supports that I need to hack off with my Dremel!)

Now I'm just on a mission to print out all the internal parts that make up the mechanisms for the movements to happen...

..and yes, it's all in PLA, as that is what works for me and this printer. ABS is just a waste of time and effort at the moment: it rarely sticks to the bed (I've been through the rounds of increasing the bed temp), and when it does, after a few cm it starts to split. So I decided that rather than messing around, I'll just do it all initially in PLA and then come back and print again in ABS - probably once I've got myself a glass bed in place (or a different 3D printer!).

I think we might be 3D printing for a few more weeks yet... as there are 10-13 parts needed for the skull and then... and then... and then... you get the idea :-)