Wednesday, 16 May 2018

Is it a real person? Does it really matter? Awesome Digital Human in Unreal Engine

Okay, so you may have noticed that recently, I made a 3D Avatar / Digital Human using the Unity 3D engine and connected it up to the IBM Watson SDK.

Well, straight after finishing that piece of work, I thought... hmmm... I also have the Unreal Engine installed on my laptop, I wonder if there's an IBM Watson SDK sample app that could use that engine to do what I did above...

A quick DuckDuckGo search later and, yep, here's a GitHub repo.

A quick 50,000ft difference check comes down to: what are you more comfortable writing your code in?
C# from Microsoft (Unity) or C/C++ (Unreal Engine)?
Which have you had more experience using?

So, why am I pointing this out?  Well, I happened to stumble across a Digital Human that is created and controlled by the Unreal Engine... and, well, it's pretty awesome!
My Unity / "couple of days" effort project doesn't really compare to this (probably) quite-a-few-£££££ effort - but what it does show is that this level of technology and effort is moving at a very fast pace!

The same backend can be used (i.e. IBM Watson services); it's just the front-end UI and interactions that can be de-coupled and "re-bolted on" as the technology advances.

See for yourself:

Here's the technology from Cubic Motion:

Here's a longer video about how advanced the Unreal Engine now is in relation to the quality of Digital Humans and interactions.....

Time to brush up on my C/C++ skills (do I really need to?!)

Tuesday, 8 May 2018

Mini Raspberry Pi handheld notebook

Most certainly not a "new" idea, but I eventually got around to putting this together:

I 3D printed the parts, which takes about 6-8 hours in total.  I already had one of these Bluetooth keyboards knocking around from way-back-when; it's kinda cool as it has a little mouse pad on the right-hand side and, if you set up the Pi properly, it picks it up on boot-up and "just works".

I have a 3.5" touchscreen and an RPi3, albeit not the same screen as shown in the article (but I'll come back to that later).  I already had a battery power pack from another project that didn't go anywhere, so it seemed like a good idea to throw all these parts together and make this.

The one gotcha I currently have is related to the hinge screws/nut things - I don't have what is needed.  I've gone through every "box of bits" I have; what I did find was 2 pop-rivets that fit just right, but obviously they don't stay in place.  I'm sure I'll figure something out over the next week or so... or find something else I can take apart for some "donor parts" that'll do the job - I love recycling :-)

Anyway, I printed this out, but the keyboard surround bottom freaked out a bit on the first 3D print, so when I put the keyboard inside it actually split and broke - oh, no! I can repair it, but as my missus says, "why don't you just print it out again?".  Good point.  When I return home later in the week, I'll set the 3D printer up again and re-print it....

But the good news is, I did actually fit everything inside the lid/screen part.  I sort of followed the Adafruit article - well, the essence of it - I ended up having to move things around a bit as my parts weren't exactly the same, but the wiring diagram was helpful.  As I said earlier, my screen was different, so it connected directly onto the RPi3 and I had to solder directly onto the #2 and #6 pins (underneath).  My power battery had a massive switch connected to it (that I bodge-job fitted for a different project) and I didn't have a microswitch for turning it on/off - all I had was an old-fashioned metal flick-switch, which I think looks more CyberPunk :-)

After much fun with a soldering iron, super-glue (before realising that was a silly idea), some of those white-pad things (for holding my different-sized screen in place) and a couple of random screws for the lid, I have a working and functioning little RPi3 notebook - with a touchscreen!

Okay - it's nowhere near as cool as my Open Pandora (which, it turns out, I've had for 10 years now - YES, 10 years; this was sooooo advanced even back then)

But it's a fun little project and it's portable too... I can plug Arduino devices into the USB connections, and now I'm starting to think of some other interesting projects to use it for.....

I'll update with some photos of my actual device later in the week.

right, time to listen to some music:

[UPDATE: Here are photos of my real device in the making....]

A from-the-back photo, showing the RPi3 internals

The first 3D print of the keyboard casing

As you can see, it didn't come out too well, which caused a break

3D printer fired up again...

..and under way

I gotta tidy up my desk at some point!

...and there we go, printed and fits nice and snug

side shot showing the USB connectors still available to use

and there she is all booted up and working as it should!

The keyboard has a mouse pad, and the screen is touch-enabled too

of course, I'm straight into Terminal  :-D

There we have it, all working as it should..... well.... until I downloaded VLC and started to play some music.  I heard nothing.  I switched from HDMI to Analog, thinking that was the issue.  Nope.
I plugged some regular earphones into the 3.5mm socket and the sound was working fine.
I think I may have fried the PAM8302A sound amplifier, so I've ordered a new one from Amazon; it should arrive tomorrow, and I'll swap it out and see if it does the job.  It should do.
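For anyone hitting the same silence: before blaming the amplifier, it's worth ruling out the audio routing from the command line.  This is a rough sketch, assuming a stock Raspbian of that era (the `numid=3` mixer control was the usual way to force the output on the Pi - double-check on your own image):

```shell
# Force audio out of the 3.5mm analogue jack rather than HDMI
# (numid=3 on Raspbian: 0 = auto, 1 = analogue jack, 2 = HDMI)
amixer cset numid=3 1

# Make sure the PCM output isn't simply muted or turned right down
amixer set PCM -- 100%

# Play a test sound that ships with Raspbian to confirm
aplay /usr/share/sounds/alsa/Front_Center.wav
```

If the test sound comes out of the earphones but not the amp, then yes, the PAM8302A (or its wiring) is the likely culprit.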

Then it'll be "all working as it should"... will video it for proof.

And boom! Somehow I managed to fry the OS installation... it won't boot now.  Okay, time to put NOOBS back on and set it all up again.
I must remember that I need to set up Bluetooth to bond to the keyboard, then visit here to download the screen driver and run ./LCD35-show, then a reboot and it's all working again...
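As a note-to-self, the rebuild steps above look roughly like this from a terminal.  A sketch, not gospel: the keyboard MAC address is a placeholder, and I'm assuming the driver is the widely used LCD-show repo for these 3.5" panels - use whatever driver actually matches your screen:

```shell
# 1. Pair/bond the Bluetooth keyboard (AA:BB:CC:DD:EE:FF is a placeholder -
#    run "scan on" interactively first to find the real address)
bluetoothctl <<'EOF'
power on
agent on
pair AA:BB:CC:DD:EE:FF
trust AA:BB:CC:DD:EE:FF
connect AA:BB:CC:DD:EE:FF
EOF

# 2. Fetch and install the 3.5" LCD driver, then reboot
#    (assuming the common LCD-show repo - check your screen's own docs)
git clone https://github.com/goodtft/LCD-show.git
cd LCD-show
chmod +x LCD35-show
sudo ./LCD35-show   # reconfigures the display output and reboots the Pi
```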

Friday, 27 April 2018

Create a 3D Digital Human with IBM Watson Assistant and Unity3D

Totally, 100% selfish plug :-)

Have you ever thought to yourself:

y'know, I want my own HOLLY (from Red Dwarf)

I have "skills".....

I'm a computer software developer.....

I have "stuff" on my laptop that I've installed / used a few times, but forgotten what it was for again....

.....and you have a spare 30-40 minutes?  (and $35)

Then look no further, welcome to the world of Unity, UMA2, SALSA, IBM Watson SDK and the IBM Watson Assistant API service!

I wrote an article that may be useful to follow if you want to do the above sort of thing on a rainy Saturday morning....
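Under the hood, the Watson SDK is ultimately just calling the Watson Assistant REST API, so you can poke it from a terminal before wiring up Unity.  A sketch along these lines - the username, password, workspace ID and version date are all placeholders; grab your real ones from the service credentials page in IBM Cloud:

```shell
# Placeholder credentials and workspace - substitute your own from IBM Cloud
USERNAME="your-service-username"
PASSWORD="your-service-password"
WORKSPACE_ID="your-workspace-id"

# Send one turn of conversation to the Watson Assistant v1 workspace API
curl -s -u "$USERNAME:$PASSWORD" \
  -H "Content-Type: application/json" \
  -d '{"input": {"text": "Hello, Holly"}}' \
  "https://gateway.watsonplatform.net/assistant/api/v1/workspaces/$WORKSPACE_ID/message?version=2018-02-16"
```

The JSON response carries the assistant's reply text, which is what the Unity side hands on to Text to Speech and the SALSA lip-sync.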

(and yes, I know the speech-to-text is not that accurate - I didn't give it much, if any, training, so I'm surprised it did as well as it did!)

Now, go forth and make your own DIGITAL HUMAN AVATAR.....

Thursday, 26 April 2018

Therapy Chatbots are Transforming Psychology

Original article available HERE

As the article says: "Chatbots are having a significant impact on numerous fields - especially the psychology sector.  Developers caution these tech tools aren't a replacement for human interactions with experts, but it's already clear chatbots are an always-available resource - which isn't the case for human health practitioners."

Check out WOEBOT
"Woebot is an automated conversational agent (chatbot) who helps you monitor mood and learn about yourself. Drawing from a therapeutic framework known as Cognitive Behaviour Therapy, Woebot asks people how they’re feeling and what is going on in their lives in the format of brief daily conversations. Woebot also talks to you about mental health and wellness and sends you videos and other useful tools depending on your mood and needs at that moment. You can think of Woebot as a choose-your-own-adventure self-help book that is capable of storing all of your entries, and gets more specific to your needs over time."

Check out WYSA

"Sometimes we get all tangled up inside our heads, unable to move on. Wysa is great at helping you get unstuck. Co-designed by therapists, coaches, users and AI folk, Wysa lets you set the pace, helps when it can, and never judges. It is free and anonymous - so give it a try!"

The world of Chatbots is definitely changing and getting more useful too....

Wednesday, 21 March 2018

IBM's Watson-based voice assistant is coming to cars and smart homes

This all sounds very familiar :-D


One of IBM's first partners, Harman, will demonstrate Watson Assistant at the event through a digital cockpit aboard a Maserati GranCabrio, though the companies didn't elaborate on what it can do. In fact, IBM already released a Watson-powered voice assistant for cybersecurity early last year. You'll be able to access Watson Assistant via text or voice, depending on the device and how IBM's partner decides to incorporate it. So, you'll definitely be using voice if it's a smart speaker, but you might be able to text commands to a home device.

Speaking of commands, it wasn't just designed to follow them -- it was designed to learn from your actions and remember your preferences. If you allow the Watson-powered applications you use to access each other's data through IBM Cloud, which delivers and enables Assistant's capabilities, they can learn more about you and deliver life-like conversations with context.

In IBM's sample scenario, Watson Assistant can automatically check you into your hotel and make sure your rental car (so long as it has a Watson-powered console) is ready as soon as you walk out of the airport. The car's console can suggest locations to visit en route to your hotel, as well. If the hotel uses a smart assistant powered by IBM's AI, then it can automatically tweak your room's temperature and lighting based on your preferences and even start playing music you like when you're almost there. The hotel room's (Watson-powered) wall dashboard can also display your schedule and emails before you even walk in, using the electronic key automatically sent to your phone.


Tuesday, 20 March 2018

Automated harvesting by agricultural robots

This is something close to my own heart and something I'd like to get more involved with in the future.

I like the idea that the robot harvesters can use visual object detection and then recognition to examine the fruit and determine if it needs picking - how it does that picking raises an eyebrow, but mixed with a robotic arm, pressure-sensitive 'finger' tips and some good coding, it'd be feasible.

Then, in the grocery shop as a customer, you can whip out your phone, scan the fruit on the shelf and it can "highlight" which fruit will ripen when, allowing you to get the best choice of fruit to meet your needs.  I.e. do I want to eat it today? Actually, I want to eat 3 of these in 3 days' time - which are the best 3 to select?

Sounds all very cyberpunk....

Friday, 9 March 2018

Program in C

Frank, I doff my cap to you, Sir - awesome find and very relevant!