Augmented Reality or Artificial Reality?

In today's world, augmented reality is widely dismissed as 'just a gimmick'. AR apps are cool for a quick demo, but the vast majority of people have never used AR as part of their daily lives. The problem isn't even a lack of useful apps- some incredibly useful ones do exist. For example, Google Maps features an AR mode that overlays virtual signs and arrows on your phone's live camera view to point the way to your destination. That sounds like an incredibly useful feature- so why might you only just be hearing about it for the first time? Because the way you interact with the software is discouraging. Nobody wants to hold their phone up at head height, pointing it around like a madman. It's just not a natural interaction.

Now imagine for a second that your phone is cut out as the middleman. This virtual path is laid out right in front of your eyes- utilising your entire field of view. Now suppose a cool piece of graffiti catches your eye as you're walking. After you glance at it for a second, a text box appears next to it showing the name of the artist. You ask to see more, and all of a sudden other pieces from this artist surround the one in front of you. This is where smart glasses come in, turning AR from augmented reality into artificial reality.

This experiment started nearly a decade ago: back in 2013, Google released the first version of Google Glass as a proof of concept. Marques Brownlee said at the time: 'the idea here is that this is a new form factor- this is the beginning of a new sort of paradigm'. This new form factor opened up a new range of use cases, from shooting first-person action shots to hands-free navigation whilst cycling. But at the end of the day, it was still just an experiment.

A huge barrier with Google Glass was the user interface. It felt more like the heads-up display you'd see in a video game than a way to genuinely interact with the world. But we're now seeing these genuine AR interactions on our mobile devices, just like the Google Maps Live View I mentioned earlier. This is thanks to frameworks like Apple's ARKit and Google's ARCore, introduced in the last few years, which help developers easily create full-blown AR experiences- of which there are now thousands. Apple's inevitable Glasses will be based around ARKit, meaning developers will be able to port their existing apps to the platform very easily, ready for launch day. This, in turn, would provide Apple Glasses customers not only with the intuitive user experience that the original Google Glass just couldn't deliver at the time, but also with a whole host of true AR apps.
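To give a flavour of how little code a basic ARKit experience takes, here's a minimal sketch (the class name and marker geometry are illustrative, not from any real app): it starts world tracking with plane detection and drops a small virtual marker on each surface ARKit finds.

```swift
import ARKit
import SceneKit

// Illustrative sketch of an ARKit view controller, not a production app.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking keeps virtual content anchored in place as the device moves.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // Called whenever ARKit adds a node for a new anchor- here, a detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Place a small virtual sphere on the detected surface.
        let marker = SCNNode(geometry: SCNSphere(radius: 0.02))
        node.addChildNode(marker)
    }
}
```

The session and anchor concepts here aren't tied to a phone screen at all, which is exactly why an ARKit-first glasses strategy is plausible: the same app logic could render to lenses instead of a camera feed.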

Compute power is also much less of an issue now: these mobile AR frameworks have proved that sophisticated AR experiences can run on just the compute power of a smartphone. Future Apple Glasses would be an accessory to the iPhone, much as the Watch is, letting the phone do all the heavy lifting. But it might not be too long before Apple makes them completely independent. The Apple Watch itself is almost entirely independent, with the Watch S4 matching the power of an iPhone 6S. These chips are so power-efficient that an Apple Watch with an always-on display lasts nearly two days. This puts Apple in a very good position to produce a pair of glasses that can last all day.

Another area of tech that's seen huge improvements is voice assistants. This matters for a wearable like this, as there's no fixed screen to interact with, and a sophisticated voice assistant goes a long way towards keeping the software interaction natural. Nowadays they're capable enough to adjust your lights, request an Uber, and even send cash to people. It's safe to say that Siri and Google Assistant are both miles ahead of what they used to be. In fact, Google Assistant didn't even exist when Google Glass first came to market.

The last thing that kept Google Glass from fulfilling its potential was a social factor- our work culture. The COVID-19 pandemic has forced us as a society to shift all possible work to remote forms, with software such as Zoom and Slack being used at record highs. Many companies, including Twitter and Shopify, have even made remote work a permanent option. So where do AR glasses fit into this? I believe they can help working from home reach its full potential. You could have multiple virtual monitors right in front of you, without having to set up or invest in a home office. You could sit in a virtual meeting with AR representations of your colleagues whilst referencing an AR spreadsheet. In a time like this, and for a long while after, consumers will be far more prepared to embrace AR glasses than ever before.

The big question is: when will we actually see a pair of glasses like these come to fruition? Most of the pieces are in place- ARKit is a hit with developers, and Apple's tiny chips are blazingly fast. If you ask me, we could very well see a first-generation pair of glasses as early as 2021. That point could also mark the beginning of the end for the smartphone. The iPhone and Watch both reached a maturity point around four years after launch, with the iPhone 4 and Watch S4. The story could very well be the same for the glasses when they reach their fourth generation, around 2025. Things are going to be very interesting until then, as we slowly watch the acronym AR shift from augmented reality to artificial reality.
