Bill Gates has famously said, “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.” I’m always struck by this truth when the end of the year approaches and I look back at what has changed in technology during the previous twelve months.
It usually doesn’t feel like the needle has moved much, but then when I compare it to what was normal ten years ago, I realize that seismic shifts have indeed been happening. Real-time traffic updates on my mobile phone, just as an example, are something that I now use multiple times a day as I try to dodge road congestion in my area. Ten years ago, I don’t think I even had a mapping app on my phone, let alone one with a live representation of where traffic was moving and where it wasn’t. Could I live without it now? No way! That’s the kind of change I mean.
The potential of mass-market AR
Looking forward, the trick is to identify which of the tiny shifts around us today are going to be the ones that generate an avalanche of change in the future. In 2019, the standout development in this category for me has been the advent of mass-market augmented reality (AR) integrated with reality. I had two experiences this year that made me feel like I was pulling back the edge of a curtain and peeking into a vibrant, almost unimaginable future, and they both had to do with encountering AR out in the real world.
The first experience came courtesy of Apple. In August, the company revealed the fruits of an AR partnership with a group of artists, developed through The New Museum. Seven “AR artworks” were installed in six cities around the world, and you could sign up at the flagship store in each city for a walking tour of the artworks, guided by an Apple representative.
The setup was a little fragile, but once we had gotten all the elements lined up just right, what we got to see was amazing. Looking through the phone’s camera at the world around me, I saw a ribbon ooze out of a drainpipe on the wall and begin writing words in the sky. I was able to peek inside a hole in a tree and watch the tiny, hidden lives of a man and his dog. I read poems written on the sidewalk and watched the entire world around me turn into a black-and-white drawing. I organized boxes on a conveyor belt so that they would run in an endless looping pattern. I chased multi-colored bubbles down the street and gawked at a giant rising into the sky from the top of a nearby building.
As I saw and interacted with these visions on my phone, my mind filled with possibilities for the future of this technology.
- What if I could look through my phone’s camera and see directions for my destination written on the road itself?
- What if I could look through my phone’s camera at a restaurant and see reviews left there by my friends?
- What if I could see the history of a place written in the sky next to it? Or acted out by virtual characters?
- What if I could see what this block looked like on the day after the 1906 San Francisco earthquake?
What’s clear is that this slightly fragile technology, which currently needs to be driven and curated by a guide with a central control pad, is just one step away from forever changing the way we’re able to see and understand the world around us.
Spontaneous gameplay
The second major AR experience that I had in 2019 happened just last month, after Microsoft launched an early version of its mobile game Minecraft Earth. This is a Pokemon Go-style game, in that you need to physically be in a location in order to interact with the game elements there, but it’s a full generation beyond the AR that we experienced in 2016 with Pokemon Go. Minecraft Earth has AR “adventures” that you experience in place: you can look through your phone’s camera at the real world, place a structure down on the visible ground (which means that Minecraft Earth can calculate where the “ground” is), and then walk around it and interact with it at full size.
Above is a screenshot that I took of one Minecraft Earth adventure: a barn with a couple of cows downstairs and a chicken in the loft. Having placed this on the ground in a field (you can see the real grass and buildings around and behind it), I can now walk around this AR structure and interact with the cows and the building by tapping my screen, which in this case involved turning the cows into leather (sorry, cows) and digging down into the ground, where I found some iron with which I could make a sword.
But the point isn’t the gameplay itself, it’s that I was able to have a spectacularly immersive experience with an AR construction that looks real, integrates visually with the real world, and allows me to interact with it in rewarding ways – in a mass-market app on my phone, with no benevolent guide needed to turn the AR on and off for me. Whoa. Now we’re really starting to get somewhere.
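For anyone curious what that “calculate where the ground is” step looks like from a developer’s point of view, here is a minimal sketch using Apple’s ARKit with SceneKit rendering (the class and object names are my own illustrative choices, not anything from Minecraft Earth): the app asks the framework to detect horizontal planes, and when one is found it anchors a virtual object on top of it.

```swift
import ARKit
import SceneKit
import UIKit

// Minimal sketch: detect a horizontal plane ("the ground") through the
// phone's camera and set a virtual object down on it, which is the basic
// step an AR game needs before it can anchor a full-size structure.
class GroundPlacementViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to look for horizontal surfaces (floors, fields, tabletops).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds an anchor for a newly detected plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Stand a one-meter cube on the detected ground as a placeholder
        // for a real AR structure like the barn in the screenshot above.
        let box = SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.5, 0) // lift by half its height so it sits on the plane
        node.addChildNode(boxNode)
    }
}
```

The heavy lifting here is the plane detection itself; once the framework hands the app an anchor for the ground, placing a full-size structure is mostly a matter of attaching geometry to that anchor and letting the device’s tracking keep it fixed in place as you walk around.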
Looking back at 2019 from 2021, we might, just as Bill Gates predicts, not see much difference after only two years. There might be a few more apps that integrate AR with reality, a few more games that have broken out of the gaming console and can be played while you’re out walking around.
But I believe that when we look back on 2019 from 2029, we’ll realize that 2019 was the last year in which we only saw the things around us as they actually were, not as they could be when enhanced by art or information or directions or a game or some other kind of transformative AR filter. And while the concept of these ubiquitous AR overlays might seem strange or intrusive today, I suspect that if you ask me in 2029 if I can live without them, my answer will be “No way!”