Monday, 12 February 2018

PWAs...hang on, did time stand still and then loop back on itself?

Back in the day, I purchased one of these phones.  (Wow, was it really in 2013?... I suppose it must have been.)
The cool thing about it at the time was that it wasn't Android, Apple or Microsoft.  It had its own OS and the potential to investigate "other options" for apps.  As it was backed by Firefox / Mozilla, it made sense that it was driven by the Web and, more specifically, web apps that run in web browsers.  Hey, that suited me fine - it's what I've been mostly doing for, well, far too long.

Whilst I have no issue, when the need requires, coding a native Android Java app, or flipping my head the other way and coding in Swift for an iOS device, it did kind of irk me that I had to start following the code religion camps again... it also bothered me that I wasn't focused on just coding for the device itself, but having to code the server-side APIs and usually a web app that offered the same functionality as well.

Now, maybe it's my issue and not other people's (most likely), but when I want to make something pretty fast, I fall back on whatever tools / coding languages I can use to help with that, and to me that's good old HTML, possibly some JavaScript framework and some UI framework too...

Back to the ZTE phone - to make an app for this, you basically built a web app that you could host on a web server, like any other web app; you just needed to create a manifest file that would get downloaded, and the phone would then create an "app icon" on the home screen.  When you pressed it, it would load the web app just as you would from the web browser.  There were some little tweaks you could do to start to make the web app more "mobile friendly", and it showed some great creative thinking..... it also handled local caching and the concept of "offline" usage too.  (Which raised my eyebrows back to the good old AvantGo days.)
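For anyone who never saw one, that manifest was just a small JSON file served alongside the app.  From memory of the old Open Web App manifest format (all values here illustrative), it looked something like:

```json
{
  "name": "My Web App",
  "description": "A web app that installs to the home screen",
  "launch_path": "/index.html",
  "icons": {
    "128": "/img/icon-128.png"
  },
  "developer": {
    "name": "Me",
    "url": "https://example.com"
  }
}
```

The phone fetched that file, dropped an icon on the home screen, and the "app" was really just your hosted web app underneath.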

...then Firefox OS and the ZTE phones went into decline, and so did this novel, progressive approach to web app creation for mobile devices.  The Android and iOS app stores then got flooded with millions of native apps and the common consensus was that that was the way to go.
Tooling like Cordova was mocked by the purists on each platform (yawn), and no-one really wanted to work outside the "in crowd" (okay, maybe there was myself and quite a few others, but you get the idea - most people just follow the path of least resistance).

Hey, I even built native Android Java apps for very high profile clients in San Francisco and native Swift iOS phone/tablet apps for major car manufacturers in Europe.  Sometimes you just do what is needed, and on both those occasions native was what was defined.  The amusing thing being, both projects then evolved to require a server-side nodeJS REST API component (JavaScript) and a matching web application (AngularJS & Bootstrap).  Now, those skills are complementary - HTML/JavaScript/CSS and JavaScript again.... so we'd need more people involved, and you know what happens when you add more people to the mix.... ;-)

Now, the above was back in mid-2016 / early-2017.... (I haven't done many mobile apps since; okay, I did a couple of examples to demonstrate Watson APIs, but nothing major, just some simple 3-hour coding apps).

Now, zoom forward to February 2018...and lo-&-behold I see this article:  What are Progressive Web Apps
and on the same day, I then get an invite to an Ionic 3 PWA YouTube live-stream...whoa?!what?...

Okay, having a bit of a read through, it seems like the early days have 'progressed' rather well and it looks like some sense is coming back to the world.

Google is even working on (and I seem to remember this from the "back in the day" era) WebAPK, which wraps your PWA in a native APK file that installs as an app.
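The modern descendant of that old manifest idea is the W3C web app manifest (manifest.json), which is what Chrome keys off when minting a WebAPK.  A minimal example (values illustrative) looks like:

```json
{
  "name": "My PWA",
  "short_name": "MyPWA",
  "start_url": "/index.html",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#2196f3",
  "icons": [
    { "src": "/img/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

Same concept as 2013, just standardised: host it, link it from your page, and the browser handles the "install to home screen" part.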

Apple is also adding PWA features into Safari, so it looks like the planets might finally be starting to align.

I also note that my favourite mobile / web app UI tool, Ionic, is going the way of the PWA:

As I was saying earlier, you had the mobile app developers who had to learn and become experts in a specific language and platform, with all of its oddities/quirks, and they were able to make re-usable things which helped with speed of development and deployment..... but - and here's the boring, business-headed view of things - what happens after that?
Well, you have the "now I must submit my app for review to the App store" cycle, where you have to wait for "them" to assess, review and possibly reject your app.  Then it can appear on the "App store".
Then you do some more work and want to release an update (big or small); you have to go through the same cycle, time, cost & effort being munched up....
Then you realise that you need an urgent fix, and you have to repeat the above.  There is NO guarantee that the end users will actually download your latest update and install it - I know that I quite often swipe "that way" and ignore the "you have 23 updates to install" on my phone.
You now have to support multiple versions of your app on multiple different devices on the same platform, and if you have decided to make apps for both platforms you need to manage all of the above twice... hmmm... it's starting to get rather expensive, all this, and for not a lot of return on benefit.

Whereas, back in the good old days when you had yourself a static web site, you could just FTP a set of new files and the job was done.  Then you morphed to a web app that had some static but mostly dynamic content; you could update the content and boom, it was there, it was live.  If there was a structural change, you could release that on its own - and you were the one who approved it.

Maybe, by going forward so quickly, there is a realisation that you are now siloed, with 2 companies controlling the apps that get to the end users, and you are now paying quite a lot of money to manage and maintain your apps through these 2 companies just to reach your user base.  Did you really realise the lock-in that you were walking into?  Probably not.
Also, if you were to launch a new app today, you'd get buried amongst millions of other apps, and your target users might never get to your app because of all the noise.  They might know your URL (after all, wasn't that also the big thing before: "get your web domain, so everyone will know your brand and how to reach you" - that seems to have been thrown out with the bath water).
Back "in the day", you could just make your web site/app, host it someplace, get traffic to it via different marketing methods, and people just used it - no need for the overhead of "app stores"...  Revolution time; well, small/tiny revolt time...

As Ionic quite nicely put it - visit here:

It seems to be the hipster thing to be doing.... but like all hipster things, it's just the youngsters doing the thing that us oldsters were doing previously - we were told we were uncool, until now, when it is suddenly cool.... (oh, I don't know!?)  All I do know is that I'm sticking with PWAs for 2018!

Yes, I'll probably do an article soon on making a PWA that calls Watson REST APIs from both desktop web browsers and mobile browsers, and see how to cater for both from different devices.  In fact, I'll probably start doing that in the evenings this week, as I have a requirement for it.....

Until then, HERE's a walkthrough of taking an existing Ionic app and making it a PWA.  (Did someone say service-worker.js?!  More on that later!)

oh, and for the people who remember AvantGo....I find this little nod to the invention of AvantGo amusing:

Thursday, 8 February 2018

Self parking slippers from Nissan

original article HERE.

At first glance, the ProPILOT Park Ryokan looks like any other traditional Japanese inn, or ryokan. Slippers are neatly lined up at the foyer, where guests remove their shoes. Tatami rooms are furnished with low tables and floor cushions for sitting. What sets this ryokan apart is that the slippers, tables and cushions are rigged with a special version of Nissan's ProPILOT Park autonomous parking technology. When not in use, they automatically return to their designated spots at the push of a button.

For its primary application, Nissan's ProPILOT Park system uses an array of four cameras and twelve sonar sensors to wedge its host vehicle into even the smallest of parking spaces—whether it's nose-in parking, butt-in parking, or trickiest of all, parallel parking. It seems unlikely that the slippers use quite the same technology, although Nissan does suggest that the technology is at least similar, which would mean that the slippers are operating autonomously rather than relying on someone off-camera with a remote control. If you'd like to investigate further, Nissan is offering a free night for a pair of travelers at this particular ryokan, which is located in Hakone, Japan—a lovely place that you should consider visiting even if self-parking slippers aren't on the amenities list.

Tuesday, 16 January 2018

NAO robot and IBM Watson services

You know I like Robots.

You know I like IBM Watson API services.

Check out how to get those 2 working together - there's a full description of everything needed in the GitHub repo.

(watch the YouTube replay of the live cast HERE)

I'm going to take this as a baseline and instead of using the NAO, I'm going to use my M1ll13 robot-head.

oh, btw - I've had some really good 3D printing and building success - roll on the weekend when I can get those web-cam eyes hooked up and working too!

Tuesday, 9 January 2018

Using machine learning to spur human creativity is so 2018

You may recall my Digital Music Tangent post a little while back?....  It may have sounded a little odd and off-the-wall, but it's funny how things sometimes connect up in the universe.

Welcome to the IBM Watson Beat.  You can use this open-source software to create rich compositions from simple input melodies.  That's right, if you can hum it (and record it), you can get Watson to take these elements and combine them into some non-human-made music.

Check it out and run it on your own laptop and get creative:

Using machine learning to spur human creativity is so 2018! Watson has used a combination of reinforcement learning and neural networks to create the Watson Beat.

The first album, "Space", is actually very melodic and has replaced the "meditation music" that I usually listen to when I start the working day.

(Full album on YouTube here)

Wednesday, 3 January 2018

3D Printing is making some 2018 progress

M1ll13 is starting to manifest into reality......

(btw - those are nose supports that I need to hack off with my dremel!)

Now, just on a mission to print out all the internal parts to make up the mechanisms for the movements to happen.....

..and yes, it's all in PLA, as that is what works for me and this printer.  ABS is just a waste of time and effort at the moment; it rarely sticks to the bed (I've been through the rounds of increasing bed temp) and when it does, after a few cm's it starts to lift.  I decided that, rather than messing around, I'll just do it all in PLA initially and then come back and print again in ABS - probably once I've got myself a glass bed in place (or a different 3D printer!)

I think we might be 3D printing for a few more weeks; there are 10-13 parts needed for the skull and then.. and then.... and then.... you get the idea :-)

Monday, 18 December 2017

Using ROS to control GPIO pins on a Raspberry Pi 3

As the title says, it was time for me to document how to do the "simple" task of using ROS (Robot Operating System) to turn an LED on/off (well, we can do a LOT more than that by using the GPIO pins, but as a baseline, it's a starter).

Because I've installed ROS onto the Raspberry Pi itself, I can create ROS nodes directly on the RPi.
So let's get on with it and create a simple demo to blink an LED using ROS topics from the RPi.

First step, we have to download "wiringPi"
$ git clone git://
$ cd wiringPi
$ sudo ./build

The interesting thing is that everyone decides for themselves how they are going to refer to the GPIO pins on the RPi... I just keep track of the physical pin (as that doesn't change!)
And here is the GPIO pin layout in relation to wiringPi:

Now that we've installed the library we're going to be using, let's switch to creating a ROS package for the LED blink demo.
I've already created a workspace previously, as I was using the ROS ROBOTICS PROJECTS PDF as a walkthrough for the USB web cam sample (which worked great from an Ubuntu VMware, but fails on a real RPi...).
Previously, I was using the RPi to connect to the Servo controller and have used quite a few of the GPIOs already, so, I just decided to unplug a GPIO and re-purpose it for this example.
I plugged into physical PINs 6 and 10, so that is actually GPIO 12 and 20...

As I had already created a workspace I could just re-use it, but I'll re-document it here just in case I have to rebuild the SD Card (again!)

$ mkdir -p ~/ros_project_dependencies_ws/src
$ cd ~/ros_project_dependencies_ws/
$ catkin_make
$ source devel/setup.bash
(also remember to add that source command to the end of the ~/.bashrc file)

$ cd ~/ros_project_dependencies_ws/src
$ catkin_create_pkg ros_wiring_examples roscpp std_msgs
$ cd ros_wiring_examples
$ mkdir src

Now create a file called blink.cpp, with the following content:
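The listing itself didn't survive in this post, so here's a minimal sketch of what it would look like - a standard roscpp subscriber driving wiringPi (the pin number and node name below are my assumptions; match the pin to your own wiring):

```cpp
// blink.cpp - minimal sketch: subscribe to /led_blink (std_msgs/Bool)
// and switch an LED via wiringPi.  Run as root on the RPi (GPIO access).
#include <ros/ros.h>
#include <std_msgs/Bool.h>
#include <wiringPi.h>

const int LED_PIN = 1;  // wiringPi numbering - an assumption, adjust to your wiring

// Called every time a message arrives on /led_blink
void blinkCallback(const std_msgs::Bool::ConstPtr& msg)
{
  if (msg->data) {
    digitalWrite(LED_PIN, HIGH);
    ROS_INFO("LED ON");
  } else {
    digitalWrite(LED_PIN, LOW);
    ROS_INFO("LED OFF");
  }
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "blink_led");
  ros::NodeHandle n;

  wiringPiSetup();            // initialise wiringPi (needs root)
  pinMode(LED_PIN, OUTPUT);   // drive the LED pin as an output

  ros::Subscriber sub = n.subscribe("led_blink", 10, blinkCallback);
  ros::spin();                // hand control to ROS; callbacks fire from here
  return 0;
}
```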

That code will SUBSCRIBE to a topic called /led_blink, which is a Boolean type.  If the value is TRUE, the LED will turn on; otherwise it will be off.

Navigate up to the ros_wiring_examples folder and edit the CMakeLists.txt file like so:
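The screenshot of those edits also got lost; roughly, the additions would look like this (the blink_led target name matches the binary run later, and the /usr/local paths assume a default wiringPi ./build install - adjust if yours differs):

```cmake
cmake_minimum_required(VERSION 2.8.3)
project(ros_wiring_examples)

find_package(catkin REQUIRED COMPONENTS roscpp std_msgs)
catkin_package()

include_directories(${catkin_INCLUDE_DIRS})

## wiringPi installs to /usr/local by default when built with ./build
include_directories("/usr/local/include")
link_directories("/usr/local/lib")

## builds the blink_led binary from the blink.cpp file
add_executable(blink_led src/blink.cpp)
target_link_libraries(blink_led ${catkin_LIBRARIES} wiringPi)
```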

Now, we need to re-run the catkin_make command, so:

$ cd ~/ros_project_dependencies_ws/
$ catkin_make
$ source devel/setup.bash

That's it, we're done.  We can now run and test this.  One thing to remember: on the RPi, when you are interacting with the GPIO, you need to be the root user.

TERM1:
$ roscore

TERM2:
$ sudo -s
$ cd ~/ros_project_dependencies_ws/build/ros_wiring_examples
NOTE that the folder is build and not src
$ ./blink_led
(this will execute the compiled/built blink_led application, built from the blink.cpp above)

If we take a look at the RPi, we see the LED is "off":

TERM3:  (sending a "1" will turn on and a "0" will turn off)
$ rostopic pub /led_blink std_msgs/Bool 1

Now, if we look again, we see the LED is "on":

$ rostopic pub /led_blink std_msgs/Bool 0

and if we take a look at the blink_led app output, we can see that the console log shows the behaviour from above:

yay! we can now turn an LED on and off by publishing a value to a rostopic.... now that's the baseline that we can build upwards from!  Controlling servos, sending signals to perform all sorts of actions can now happen!....

3D Printer arrives

After much hassle and delays and patience.....the 3D Printer arrived.

Well, yes, it arrived on a Sunday afternoon.  Whilst I was out.  At Cheddar Gorge, because no-one is going to deliver anything on a Sunday afternoon, are they?  It's safe to go out and get some cheese....

nope.  I came back home and found this box on the doorstep.  Yes, it was very wet and soaking up the water.  I pushed it into the porch....
I smiled and thought of positive things, which, as it turns out, was the right thing to do.  The box had another box inside it, so the 3D printer was well protected.

I did the classic "sweep everything off the dining table" manoeuvre and set about putting it together:

I stuck the SD Card in the side and selected a file that ended in .gcode - I had absolutely no idea what it was, but I just wanted to test everything was okay:

After running through the warm up and levelling the print bed, it looked like it was time to go:

Initially I didn't think anything was happening....

..and then it started to print the base layer:

 ...after some time it had grown quite a bit:

I was still none the wiser on what was actually being printed:

It was now 30% through:

...and still none the wiser?!

hang on a minute, it is starting to take on a shape:


It was pretty impressive to watch:

..mesmerising, in fact:

and there we are, all done.  First print with PLA.  "A-Okay"  :-)

I was impressed that the hand actually had the same lines as in a proper hand (easily pleased, aren't I!)

Okay, that was the first Sunday evening session.  Monday morning arrived and it was time to move it into the "office" (upstairs).  Oh look, there is a 3D Printer sized gap on the corner of the desk - I wonder how that happened? :-D

Yep....that can sit there printing away, whilst I'm "busy working":

pan out further and you realise there is more going on at "the desk" than just 3D printing:

right, time to order all those servos, 10 kilos of ABS filament and braided fishing line... ready for the next 3 weeks of printing for the InMoov robot!