Google IO 2017

Google IO is Google's annual developer conference, held in Mountain View, California. This year I attended Google IO Extended, which happens all around the world at the same time as the main IO event; it's designed for people who can't make it to the main event but still want to know the latest stuff.

There was one main theme this year from Google, and it’s summed up in this phrase:

“Mobile first to AI first”

In every area Google spoke about (from new processing hardware and home automation to Android devices), everything had been improved by AI!

Another nice fact they mentioned: Android now runs on over 2 billion devices, and 82 billion apps were installed last year.

Below are some of the big headlines!

Google Lens

A new app for your phone: point it at something, be it a flower, a restaurant sign or a Wi-Fi label, and it will understand it, identifying the flower, showing the menu for the restaurant or automatically joining the Wi-Fi! It can also translate text on signs.

They also showed a cool demo where the AI could detect an obstruction (a wire fence) and remove it from the picture. This is a huge leap in computer vision.

Google Home

Google Home seems to do a lot more than I realised; for instance, it can recognise up to 6 different people in a household and customise the experience for each one. Google is now adding free phone calling to Google Home. It's only available in the US currently: you can just ask Google Home to phone your mum, for instance, and it will recognise who you are and find your mum in your contacts. If your partner does the same thing, it will phone their mum, not yours.

Another new feature is visual responses, which is super cool. You can ask Google Home something, say "what does my calendar look like today", and Google will display it on a smart TV, Chromecast or Google-connected device. I really think this will become super useful. You could ask Google Home how long it will take to get somewhere, then tell it to send the directions to your phone.

They also introduced something called Proactive Assistance: the idea is that Google Home will detect things that may be important to you and let you know about them via a visual light on the device, for example if traffic is really bad and you have a meeting coming up soon.

Google Home now integrates with over 70 smart home manufacturers.

Virtual Reality

Google already makes a VR framework (Daydream) and a headset that your phone slots into. This year Google announced two standalone VR headsets (no phone or PC needed) coming out this year, in partnership with HTC (who make the HTC Vive VR headset) and Lenovo (who make the Project Tango tablet for 3D mapping / AR). What's very interesting here is that they are bringing out their own indoor tracking solution that does not need external sensors. They call it VPS (Visual Positioning System), which I believe could be an advanced version of SLAM.

They also announced that the new Samsung Galaxy S8 will support the standard Daydream VR headset, which I found odd as Samsung are in partnership with Oculus (Facebook, direct rivals of Vive) and already have the Gear VR.

Augmented Reality

Google announced another Tango handset (it's like a Microsoft Kinect embedded into an Android tablet) and announced Expeditions AR, which brings AR to the classroom. Kids will be able to place 3D augmented objects within the classroom, for example to see how volcanoes erupt.

Suggested Sharing

Suggested Sharing is a new feature for Google Photos that uses AI to detect well-taken pictures and who is in them. It then suggests/reminds you to share those pictures with the people in them. It forms an online collection of all the images, so you finally get to see photos with you actually in them (if someone else took them). There is also an automatic mode, for example if you always want to share pictures of your kids with your partner. Feels a little scary to me.

Cloud TPUs

So, anyone in computing will know what a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit) are. Google likes to do its own thing and last year announced the TPU (Tensor Processing Unit), which is designed to be very quick at machine learning workloads. Google is now calling them Cloud TPUs, and each one can do 180 teraflops.

Android O

There were a few new features mentioned in the keynote but nothing I found too exciting. They mentioned picture-in-picture and notification dots, both of which iOS already has. They mentioned Android Studio 3 and support for Kotlin as a first-class language; I guess it's their answer to Swift on iOS. There was the usual focus on battery usage, security (Google Play Protect) and making apps and the boot process faster; they say they have seen 2x improvements. Google has also improved copy and paste so that it automatically recognises addresses, company names, phone numbers, etc., which in all honesty I thought it already did.

iOS Support

Throughout the presentation, whatever new stuff they demoed, they kept making a point that it's also supported on iOS, not just Android (Google Assistant, Google Photos, Daydream, etc.), which I personally thought was cool.

Lastly, and probably the one that made me laugh the most!

YouTube

YouTube for TVs and consoles will now support 360-degree video, including live events, and YouTube viewing on TV has gone up by 60%. However, the big news is Super Chat and Trigger Actions.

Super Chat allows you to pay for your comment (to a live YouTuber) to be noticed, so if you really want to ask that question, you can pay for it. Not too bad, I guess. But Trigger Actions allow you to pay to trigger something in the live video, such as throwing a water balloon at the presenter or turning the lights off in their house. I can see this going downhill pretty fast.

VEX Worlds 2017 – Robotics Competition

Sorry for the late post about VEX Worlds. I thought I would have more time after Worlds to catch up with stuff but, sadly (well, not really), the kids have been mega active: my eldest son played his first football tournament, we had a holiday, and there was lots of family stuff!

So, VEX Worlds: what an amazing experience. I went along for the VEX EDR side of the competition (this year it was split into EDR and IQ) as I was showing off the EDR Tank. Sadly I had to leave the US early as my son Max was ill. Still a very cool experience!

So, the EDR Tank: well, it performed really, really well in remote-control mode. I mean, the thing was fairly slow but must have covered MILES! The batteries never died on me, nor did any motors. I did kill a few omni wheels; however, that's to be expected. Even though I left early, the EDR Tank did not, so others drove it around. I have not received it back yet to see how bad it is now, but I am sure it will be fine.

The autonomous side was a bit of a failure, to be honest, and looking back I had set myself up to fail; I will explain why. The autonomous side used ROS (Robot Operating System), which is an industry standard. I was using a Neato lidar, which is awesome but only has a range of 5 metres, plus SLAM (simultaneous localisation and mapping) to work out where I was and where I needed to go by building up a map. SLAM works by detecting features of the surrounding area to work out where the robot is. When you're in a hall that's hundreds of metres wide with very few features, a sensor with a range of 5 metres is practically useless. In the end, I just showed kids how it worked on my laptop using RViz. If I had to do this properly I would need to invest in a proper lidar with a much greater range. Another aspect which makes this very hard is all the people moving around: how can SLAM pick out features if they are constantly moving?
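
To give a feel for how little the lidar could actually see, here is a minimal sketch (assuming a standard ROS setup with rospy and the lidar publishing sensor_msgs/LaserScan on a /scan topic; the topic name is an assumption for illustration) that reports what fraction of each scan actually returns a hit within the sensor's range:

```python
#!/usr/bin/env python
# Rough sketch: report how much of each lidar scan actually hits something.
# Assumes a ROS setup with the lidar publishing sensor_msgs/LaserScan on /scan
# (the topic name is an assumption for illustration).
import math
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    # Readings outside [range_min, range_max] (or inf/NaN) are misses.
    valid = [r for r in scan.ranges
             if not math.isinf(r) and not math.isnan(r)
             and scan.range_min <= r <= scan.range_max]
    hit_ratio = float(len(valid)) / max(len(scan.ranges), 1)
    rospy.loginfo("%.0f%% of beams returned within %.1fm",
                  hit_ratio * 100, scan.range_max)

rospy.init_node("scan_coverage_check")
rospy.Subscriber("/scan", LaserScan, on_scan)
rospy.spin()
```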

Overall, the EDR Tank was hugely popular. I gave tons of fist bumps, high fives, etc.; people just thought it was cool, just a little slow.

Next year, if I did a vehicle again, I would have to make it a lot faster and forget about advanced sensors, etc.!

Here are some videos of VEX Worlds and the EDR Tank:

This week's update: It's a lie: Metabase, more VEX and some data science!

So, let's get the lie out of the way: this week's update could cover more or less than a week! It is whatever I am thinking of at the time, which may or may not be happening. So apologies for that bombshell.

Software

So, at work (O2's Innovation Lab) I am currently learning data science stuff. For anyone who knows me, this is an extremely hard task as I have the focus of a hamster on Red Bull. I am usually doing more than one thing (usually five), so it can be a struggle to learn a new skill, let alone one as difficult as data science. This week, I would say I am starting to get somewhere. I have been running different classifiers across my data, checking their scores and then looking at the confusion matrix. What that told me was that my data sucked badly; however, the upside was I could prove that my data was terrible.
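
For the curious, the loop looked roughly like this. This is a minimal sketch using scikit-learn with made-up data (make_classification stands in for my real dataset, and the two classifiers are just examples):

```python
# Rough sketch of the "try a few classifiers, check the score and the
# confusion matrix" loop, using scikit-learn. The generated data and the
# chosen classifiers are placeholders, not my actual dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for clf in (LogisticRegression(), RandomForestClassifier()):
    clf.fit(X_train, y_train)
    print(type(clf).__name__, "accuracy:", clf.score(X_test, y_test))
    # Rows = true classes, columns = predicted classes; big off-diagonal
    # numbers show where the model (or the data) is letting you down.
    print(confusion_matrix(y_test, clf.predict(X_test)))
```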

Another thing I am doing at work around data (oh, look at my focus) needed me to take some data and put a GUI over the top so people could easily "ask the data questions". I found a really cool free tool called Metabase which worked really nicely. All I needed to do was take an MS Access DB (oh boy, who uses MS Access?), convert it to CSV, and chuck it into a Postgres DB. It would have taken 5 minutes on a PC; on a Mac it took a little bit longer!
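
The CSV-to-Postgres step is only a few lines with pandas and SQLAlchemy. Here is a minimal sketch (the file name, table name and connection string are placeholders, not my actual setup); once the table is loaded you just point Metabase at the Postgres database:

```python
# Rough sketch: load a CSV exported from Access and push it into Postgres,
# ready for Metabase to query. File name, table name and connection details
# are placeholders.
import pandas as pd
from sqlalchemy import create_engine

df = pd.read_csv("exported_from_access.csv")

# Needs the psycopg2 driver installed (pip install psycopg2-binary).
engine = create_engine("postgresql://user:password@localhost:5432/mydb")

# Creates (or replaces) the table and writes all the rows.
df.to_sql("access_export", engine, if_exists="replace", index=False)
```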

Robots

So what's new on the robot front this week? Well, VEX Worlds is in less than 25 days and the software is, erm… still in development. The EDR Tank should be on its way to the US, so I made a mini version of it so that I can carry on with development. I have written some safety features into the software so that I don't mow down innocent kids; mouthy kids I will, of course, run over! The next thing I need to do is finish the bridge between the VEX Cortex and the ROS software.
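
For anyone wondering what that bridge might look like, here is a rough sketch of the shape I have in mind: a small ROS node that listens for drive commands and forwards them to the Cortex over a serial link. The /cmd_vel topic, serial port, baud rate and the "left,right" wire format are all assumptions for illustration, not the real Cortex protocol:

```python
# Rough sketch of a ROS <-> VEX Cortex bridge: listen for velocity commands
# and forward simple left/right motor values over a serial link.
# The topic name, port, baud rate and "L,R\n" wire format are assumptions
# made up for illustration, not the actual Cortex protocol.
import rospy
import serial
from geometry_msgs.msg import Twist

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)

def on_cmd_vel(msg):
    # Very crude differential-drive mix, clamped to -127..127.
    left = max(-127, min(127, int((msg.linear.x - msg.angular.z) * 127)))
    right = max(-127, min(127, int((msg.linear.x + msg.angular.z) * 127)))
    ser.write("{},{}\n".format(left, right).encode())

rospy.init_node("cortex_bridge")
rospy.Subscriber("/cmd_vel", Twist, on_cmd_vel)
rospy.spin()
```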

ROS

I have a new friend on Facebook (whoop whoop) who has been helping me with the ROS stuff; it's useful to have a sounding board when learning new stuff, especially something as complex as ROS. I have a fear that the VEX Tank may not work too well with all the people moving about. SLAM and autonomous driving work (in a very simple form) by identifying features in the environment so the robot can locate itself. When you have no real features (e.g. a long corridor) or lots of things changing (e.g. people moving about), it can get very confused. I am sure robotics engineers have a good solution to this, but being a beginner and using Hector SLAM for the first time, I am not holding my breath. My mini Raspberry Pi / LEGO version got confused if I farted near it, let alone with 10,000 kids running around!

Ending Notes

I started a statistics course as it's the precursor to the Udacity Machine Learning course.

I finished a sentiment analysis course; pretty interesting, it showed how to work out whether a film review was positive or negative.

I watched Logan; it was very good, rather violent and definitely not for the kids.

I watched Kong; it was pretty good but I preferred the previous one, which to be fair is nothing like the new one.

I started printing the InMoov project 🙂 THE BEST 3D printed project in the world!

Robot making schoolgirls set for world championships

Nothing made me happier than seeing a BBC article about these two schoolgirls. They have qualified to go to VEX Worlds next month, held in the US, to compete against 500 other teams from all over the world.

I have been lucky enough to meet these two (aged 8 and 9) a few times when I have been a VEX judge. They are extremely talented, bright kids for their age. Each time I have met them, they have walked away with an award!

For the full details please check out http://www.bbc.co.uk/news/uk-england-beds-bucks-herts-39410855

New Toy : Shapeoko 2 CNC Machine

It's my birthday very soon, so I decided that the main robot-making tool I was missing was a CNC machine 🙂 A friend of mine had a Shapeoko 2 which he no longer needed, so I felt it was only right to steal it from him 🙂 The cool thing is, he builds robots too, really cool ones 🙂

Many people (my wife, family, friends) have asked me what I am going to do with it; that's a question I am still working on. It can mill custom PCBs, which could be cool if I finally build and release a custom robot for people (a plan I hope to carry out this year). I could also use it to cut out parts to make robots, which seems like the logical answer. All I know currently is it looks fricking cool and spins really, really fast 🙂 Watch this space!

This week's update :) Big Bang, VEX Worlds.

So, last week I was at the Big Bang Science Fair, where I was asked to be a judge for the VEX Robotics National Championship 🙂


I also managed to get the VEX Tank ready to ship to the US for VEX Worlds; build-wise it's complete, software-wise it's not 🙁

My flights have been booked for VEX Worlds next month, so full speed ahead to finish the software! Anyone know ROS?

Big Bang Science Fair 2017 @ the NEC

Every year the VEX National Robotics Championship is held at the Big Bang Science Fair, which takes place at the Birmingham NEC in March. Last year I attended as a VEXIQ super user showing off my VEXIQ skateboard; this year I have been asked to be a VEXIQ judge, representing O2, who I work for. I have done judging a few times, and it is such a great experience: you get to meet some super cool kids who make the most impressive robots. It is also brilliant that my work (O2) supports this; they really have a mission to help out kids.

Anyway, I will do a full write-up of the event once it happens 🙂 Please check out The Big Bang Fair (thebigbangfair.co.uk).

The VEX EDR Tank

Over the last few months, I have been building a vehicle out of VEX EDR with the aim of getting it to self-drive. This is my first big project with VEX EDR; I usually build large stuff out of LEGO Mindstorms or VEXIQ. My view was that VEX EDR would be easier as it is a) made out of metal as opposed to plastic, and b) more powerful. I thought it would take me a week to build the vehicle, and the rest of the time would go on software. I also had to learn ROS (Robot Operating System).

It actually turned out to be more challenging than I expected. Due to my lack of experience with EDR, I just assumed metal would be stronger and the motors would just work. However, the first version of the Tank collapsed under my weight, and it took around 10 versions to get it to move me (95 kg) without the motors shutting down after 5 seconds. Unlike LEGO and VEXIQ, EDR motors have a protection circuit (a PTC) which shuts the motor down if it gets too hot or draws too much current. This is, of course, a good feature to protect the motor; however, it made my project very difficult.

I went from 8 motors driving the wheels directly to 14 motors geared down 2.44:1. The 8 direct-drive motors could move me at a rapid speed but would just shut down after a few seconds.
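
As a rough illustration of why this helps (ballpark figures, not measurements of the actual Tank): gearing down 2.44:1 divides the wheel speed by 2.44 but also divides the torque each motor has to supply for a given load by 2.44, and spreading that load across 14 motors instead of 8 cuts the per-motor effort further, which is what keeps the PTCs from tripping.

```python
# Ballpark comparison of the two drivetrains; the motor free speed and the
# load figure are rough assumptions for illustration, not measured values.
free_speed_rpm = 100.0   # assumed free speed of a single motor
load_torque = 20.0       # assumed torque needed at the wheels (arbitrary units)

for n_motors, gear_ratio in ((8, 1.0), (14, 2.44)):
    wheel_speed = free_speed_rpm / gear_ratio
    # Torque each motor must produce at its own shaft to share the load.
    torque_per_motor = load_torque / (n_motors * gear_ratio)
    print("{:2d} motors, {:.2f}:1 gearing -> ~{:3.0f} rpm at the wheels, "
          "{:.2f} units of torque per motor".format(
              n_motors, gear_ratio, wheel_speed, torque_per_motor))
```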

Here is how the build progressed:

Current state

So I managed to get the Tank to a point where it works; it is not as fast as I hoped, but it seems reliable. Next is the software. ROS is a big subject: there are lots of books on it and it is not the easiest thing to learn. I have actually made some good progress on this thanks to the community. Stay tuned for my next post about ROS.

The Comeback: Bringing Robotics to Burf Development

So people, after some long thinking and some random building, I have decided to shut down Burf.org.uk, move the content here, and focus my efforts on making this site good again!

LEGO, robots, Arduinos, anything crazy will now be featured on here.

I first want to do some posts on what I have been doing up until now, which should come shortly. Then full speed ahead on new projects!

Hack24 new version coming

So, it seems people liked it, so I am going to start making a new version of Hack24 in Unity.

Building it in Unity will allow me to get past a lot of my performance issues.

If you enjoyed the game, please let me know what you liked and what you want in the next version!