Is Google Glass ‘Robotizing’ Tesla’s workers?

Not quite dead yet? Could the Google Glass ‘Enterprise Edition’ help build cars at Tesla, or is it just showboating?

There’s an old comedy programme on the BBC called ‘Not The Nine O’clock News’, featuring Rowan Atkinson before his later, more famous ‘work’ as Mr Bean. One favourite sketch featured car shop-floor workers all calling themselves Bob, before the closing punchline suggests that modern cars are all built by Roberts.

While robots (rather than Roberts) have been a part of automobile manufacturing for many years now, their roles have been largely confined to automated sheet-metal pressing and other relatively ‘dumb’ tasks. But Not The Nine O’clock News aired in 1979 and this is 2016, so surely the state of automotive manufacturing robotics has progressed?

Of course, things have moved on, and software now handles much of the management control for shop-floor operations, as well as initial car design functions. So, over and above the sensors used to detect predictive maintenance issues, where does the Internet of Car Things fit in today?

Almost official, but not quite

The Google Glass optical head-mounted display is widely considered to be something of a failure across the IT industry. Expensive, buggy and never widely adopted by the masses, the device looked destined for the scrapheap; yet Google has stopped short of canning the project in its entirety, choosing instead to focus on the higher-spec, not-yet-fully-announced Enterprise Edition.

Electrek is a website tracking the transition from fossil-fuel transportation to electric vehicles and the surrounding clean-energy ecosystems. The news portal recently reported that electric vehicle maker Tesla Motors could be using Google Glass in an attempt to increase productivity at its Fremont factory in California.

Details relating to the exact use of Google Glass are sketchy across the several news outlets currently chewing through this and related stories. It would appear most likely that the software intelligence on board Google’s headset is being used as some kind of hands-free inventory control system for workers.

One website even called it a ‘glorified management system’, suggesting perhaps that it may be some kind of cosmetic showboating on the part of those responsible for allegedly putting the system in place.

Is Google Glass really being used?

Image credit: APX via Electrek

In case you need more evidence, Electrek hosts a photo (the same one credited above), which was also publicly visible on wearable solutions company APX Labs’ website for quite some time before being removed.

“The image shows multiple black Tesla Model S vehicles in view with a screenshot from APX’s ‘Skylight’ software in the top-right hand corner. This is a familiar view for those who have used Glass before. The device is actually capable of capturing similar shots itself called Vignettes. In the screenshot, the software shows what is purportedly a vehicle’s VIN number and various options for a factory worker to act on,” writes Electrek senior editor Stephen Hall.
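
To make that idea a little more concrete, the short Python sketch below imagines the kind of record such a heads-up inventory view might render: a VIN, the station a worker is standing at, and a handful of actions to choose from. The structure, field names and sample actions are all assumptions made for illustration; nothing here is drawn from APX’s Skylight software or Tesla’s actual systems.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: names and fields are illustrative and are not
# drawn from APX Labs' Skylight software or Tesla's actual systems.
@dataclass
class VehicleTask:
    vin: str                 # vehicle identification number shown on the display
    station: str             # assembly station the worker is standing at
    actions: list = field(default_factory=list)  # options the worker can act on

    def render(self) -> str:
        """Return the kind of compact text a heads-up display might show."""
        lines = [f"VIN: {self.vin}", f"Station: {self.station}", "Actions:"]
        lines += [f"  [{i + 1}] {action}" for i, action in enumerate(self.actions)]
        return "\n".join(lines)

# Example: what a worker glancing at the corner of their view might see.
task = VehicleTask(
    vin="5YJSA1E2XGF000000",  # made-up VIN, for illustration only
    station="General Assembly 3",
    actions=["Confirm trim package", "Flag paint defect", "Mark station complete"],
)
print(task.render())
```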

Hands-free head-sense

So is the time right for additional Internet of Things (IoT) style intelligence on the automotive shop floor? Could new devices improve efficiency, sharpen workflows, improve safety and perhaps even increase the performance of the cars that ultimately roll off the production line?

“There is significant potential to use hands-free wearables such as Google Glass in the manufacturing industry to increase efficiency, reduce errors, and improve other KPIs,” said Rich Mendis, co-founder of Enterprise Mobile Backend as a Service (EMBaaS) company, AnyPresence.

Speaking to Internet of Business yesterday, Mendis continued: “Imagine, for example, being able to read complex assembly instructions without looking away from what you are doing with your hands. Key components to make this happen at a broad scale are: (1) comfortable, reliable, and rugged wearable devices, (2) the ability to render content on the wearable in a manner that makes it easy for a busy worker to consume, and (3) the ability to integrate with existing IT systems in a secure, easy manner to retrieve or update data.”
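
Mendis’s third component, integrating with existing IT systems to retrieve or update data, is the kind of thing that can be sketched in a few lines. The hypothetical Python snippet below imagines a headset client pulling step-by-step assembly instructions from a factory back end over an authenticated API; the endpoint, token handling and response shape are assumptions for illustration, not any real vendor’s interface.

```python
import json
import urllib.request

# Hypothetical sketch of Mendis's third component: a wearable client pulling
# assembly instructions from an existing back-end system. The endpoint URL,
# token handling and response shape are assumptions made for illustration.
API_BASE = "https://factory-backend.example.com/api/v1"

def fetch_instructions(station_id: str, token: str) -> list:
    """Retrieve step-by-step instructions for a station, trimmed for a headset display."""
    request = urllib.request.Request(
        f"{API_BASE}/stations/{station_id}/instructions",
        headers={"Authorization": f"Bearer {token}"},  # secure access to the existing system
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        payload = json.load(response)
    # Keep each step short so a busy worker can consume it at a glance, hands-free.
    return [step["text"][:80] for step in payload.get("steps", [])]

# Usage (against a real back end): fetch_instructions("assembly-station-12", token="...")
```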

Mendis points out that some companies, such as Ford, are taking a baby-steps approach to wearables on the shop floor. A CIO case study explains how Ford assembly plants are using wrist-mounted mobile apps to accomplish much the same thing you might do with a head-mounted display.

Also commenting on this news, speaking to Internet of Business this week, was Jim Marggraff, CEO of Eyefluence, a company focused on eye biomechanics and the eye-brain connection.

Marggraff said that a key reason we have seen limited adoption of AR HMDs (head-mounted displays), including Google Glass, is the lack of intuitive, effective input methods.

“Today’s tap, swipe, nod, point and talk methods for controlling these first-generation devices are inadequate.  Head mounted display devices are fundamentally incomplete without eye-interaction. The ability to transform intent into action through your eyes, for 40 million hands-free, deskless workers, will accelerate adoption of HMDs in a broad range of enterprise applications,” he said.

Marggraff added, “The ability to complete a checklist, handsfree, on an oil rig, with time-stamped and identity-stamped data entry, will improve procedural adherence in this extreme environment.  In critical care and emergency room environments, instant hands-free access to patient data, through eye navigation, will improve treatment and reduce bacterial-spread from fingers touching patients then HMD controllers.”
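
Marggraff’s oil rig example boils down to very small pieces of data: who confirmed a checklist item, and when. As a purely illustrative sketch, assuming Python, such a time-stamped and identity-stamped entry might look like the following; the field names and worker IDs are invented.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative sketch only: a minimal record for the kind of time-stamped,
# identity-stamped checklist entry Marggraff describes. Field names are assumptions.
@dataclass(frozen=True)
class ChecklistEntry:
    item: str          # the checklist step that was completed
    worker_id: str     # identity stamp, e.g. taken from the headset's login session
    completed_at: str  # UTC time stamp recorded when the worker confirms the step

def complete_item(item: str, worker_id: str) -> ChecklistEntry:
    """Record a hands-free confirmation along with its time and identity stamps."""
    return ChecklistEntry(
        item=item,
        worker_id=worker_id,
        completed_at=datetime.now(timezone.utc).isoformat(),
    )

# Example: confirming an inspection step on an oil rig.
print(complete_item("Inspect blowout preventer seals", worker_id="rig-worker-017"))
```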

In Marggraff’s opinion, this technology could also be used in areas like construction sites, where workers will document compliance by taking photographs with their eyes, matching blueprint details and searching databases, all as fast as they can move their eyes to look and think.

In manufacturing, instant access to instructions, visual dashboards and eye-messages will enable workers to complete their jobs more effectively and efficiently, saving time and reducing costs.

WakingApp the neighbours

Alon Melchner, president of WakingApp, contacted Internet of Business after seeing this story discussed on Twitter. Melchner is of the opinion that Tesla’s decision to integrate Google Glass into the factory floor marks the beginning of augmented reality and the Internet of Things penetrating the industrial space.

“With so much investment already put into hands-free hardware and technologies like voice recognition, augmented reality is the next step in worker efficiency because it can create solutions that include step-by-step instructions using visual overlays of the right information on equipment, machine, and panel operations,” said Melchner.

“Moreover, AR solutions will visually enhance a worker’s access to information required to perform their job on the factory floor including cautions, tips or instructions from expert workers, schematics and any other digitised data in real-time (given that the content is created to make this happen),” he added.

Using the Internet of Things to get real-time information from devices and systems shows how, with the right content, augmented reality can be a key component of enterprise infrastructure.

There is much growth potential here, whether or not Google Glass (and indeed its Enterprise Edition) is ultimately brought fully to bear. Whatever next? Driverless vehicles or something?
