What comes after the Internet? This is not so much about replacement as evolution, at least in a Pokémon sense. What is the evolved form of the Internet? The Metaverse. The coming together – or collision – of the physical and digital worlds.
I have written about this a lot in various forms over recent years, but I was inspired to address it again after speaking to fellow futurist Cathy Hackl for an episode of my podcast, Talk About Tomorrow. This episode rather accidentally became part of a trilogy of interviews on the subject of the metaverse, bookended by conversations with Steve Sinclair of Mojo Vision, maker of smart contact lenses, and John Keefe, co-founder and director of Draw and Code, one of the world’s premier immersive content studios.
I return to this topic again and again because I can't stress enough how important it is that all of us get our heads around the Metaverse concept. So here's a bit of an overview to complement those three podcast episodes. I strongly recommend having a listen.
Technologies powering the Metaverse
There are multiple definitions of the Metaverse, but the concept was first laid out in Neal Stephenson's novel, Snow Crash. There, Stephenson described a virtual reality patterned after the real world – somewhere your avatar could walk around. Imagine Fortnite or, more accurately, Population One: a virtual world projected into your vision through a set of goggles.
Today, we talk about the Metaverse as the blending of the physical and digital worlds. What does that mean? Well, multiple technologies are shattering the sheet of glass – your phone or computer screen – that has historically held the two worlds apart. These include:
Voice assistants

Alexa and Google Home allow us to interact with the internet, ecommerce and other digital sources of data (e.g. music streaming) with our voices rather than our fingers on a keyboard or touchscreen. In doing so, voice assistants have become one of the first and most prominent ways in which the digital world has started to bleed into the physical.
Internet of Things (IoT)
It now costs just a few pence to add intelligence to an everyday object: a lightbulb, a power socket, a wristband. The result is that millions of objects are now being equipped with an internet connection and computing power. This allows them to do two things:
- feed information from the physical world back into the digital
- control the physical world from the digital
Again, this blurs the line between physical and digital: information is shared between the two worlds, and control can flow from one to the other.
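The two-way flow above can be sketched in a few lines of code. This is a minimal illustration of the "digital twin" pattern, not a real IoT API – all of the class and method names here are invented for the example.

```python
class SmartBulb:
    """Stand-in for a physical device: a bulb with a brightness actuator."""
    def __init__(self):
        self.brightness = 0  # 0-100

class DigitalTwin:
    """Digital-side mirror of the bulb's state."""
    def __init__(self, device):
        self.device = device
        self.reported_brightness = None

    def sync_from_device(self):
        # Physical -> digital: feed the device's state into the digital record
        self.reported_brightness = self.device.brightness

    def set_brightness(self, level):
        # Digital -> physical: a command in the digital world changes the real device
        self.device.brightness = max(0, min(100, level))
        self.sync_from_device()

bulb = SmartBulb()
twin = DigitalTwin(bulb)
twin.set_brightness(75)   # control flows digital -> physical
twin.sync_from_device()   # state flows physical -> digital
print(twin.reported_brightness)  # -> 75
```

In a real deployment the two halves would talk over a network protocol such as MQTT rather than sharing memory, but the shape of the exchange – state up, commands down – is the same.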
Computer vision

The rising power of computers combined with the ubiquity of cheap cameras means that machines are getting better and better at interpreting our physical world. They can recognise people, objects, locations and even emotions. The result is that computers and artificial intelligences need less and less explicit input in order to make things happen. For example, autonomous vehicles can 'see' obstructions.
Artificial intelligence (AI)

As ever, I use this term in its broadest rather than its most specific sense, to mean software that is configured to take on cognitive loads that would formerly have been shouldered by humans. AI is a critical part of the Metaverse because it gives objects and virtual entities the smarts to usefully interact with us, either in the physical or digital realm. Information drawn from voice assistants, computer vision, or any number of sensors can be interpreted into action. And that action can be wrapped in an interface, be it a digital avatar or a change in the environment, that is meaningful to us.
Some of this can be done by much simpler code than can be usefully called ‘AI’. But the added power means that each interaction doesn’t need to be coded manually. The system can learn, interpret, experiment and adapt.
Virtual Reality (VR)

Virtual Reality allows us to experience digital content as if it were a physical environment. Even at the state of the art, it is a long way from perfect. For example, moving around an environment, beyond ducks and lunges, has to be done with a joystick rather than your legs. But it is nonetheless compelling. Games engage. Virtual cinemas allow us to escape the four walls in which we've been cooped up. And as John Keefe pointed out, people are even (finally) turning to VR as a collaboration tool.
Augmented Reality (AR)

Augmented Reality, or Mixed Reality, is our primary interface with the Metaverse because it too blends the physical and digital worlds. If you have ever used a filter on Snapchat or Facebook Messenger, or played Pokémon Go, then you have experienced rudimentary AR. But it can be so much more. Some time in the next decade, probably in the next five years, we will begin the transition from handsets to headsets, giving us the opportunity to overlay digital items on to the physical world at any time. Virtual people, creatures, aliens, displays, interfaces and objects. There is a huge amount of design work to be done in order to create an experience that is natural, engaging and desirable. In many ways, this design challenge is much greater than the complexities of condensing the hardware. But I believe we will overcome it.
Impacts of the Metaverse
The Metaverse will be all-pervasive. It will be our primary interface to just about every form of transaction, and probably much more.
Today, we worry about the amount of time people spend behind a screen, lost in a digital world. I suspect that within a few years, most people will spend ten hours a day in mixed reality. This will amplify today’s difficulties but also help to resolve them. With the Metaverse you never leave the physical world, but you can twist it. You can repaint the world to meet your preferences, changing your surroundings and even the people who inhabit them.
Shopping

You will never have to ask "How much is that?" Your integral AI will know what you are looking at and seek out the answer. The shop may beam a virtual assistant into your field of view, so that everyone gets a personal shopper. Maybe it will negotiate behind the scenes with your own AI for discounts based on your loyalty.
Money

Imagine a sixth sense for your spending and credit limit. A subtle colour overlay on products telling you what you can afford, or which is the best use of your limited funds. Imagine your credit score represented in three dimensions, a monument you need to rebuild.
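The colour-overlay idea is simple enough to sketch: map a product's price against your remaining budget to a traffic-light colour. The thresholds and colour names below are invented purely for illustration – any real system would tune these against actual spending behaviour.

```python
def affordability_colour(price: float, remaining_budget: float) -> str:
    """Return an overlay colour for a product, given what is left to spend."""
    if remaining_budget <= 0 or price > remaining_budget:
        return "red"    # out of reach
    ratio = price / remaining_budget
    if ratio > 0.5:
        return "amber"  # affordable, but a big chunk of the budget
    return "green"      # comfortably affordable

print(affordability_colour(30.0, 200.0))   # -> green
print(affordability_colour(150.0, 200.0))  # -> amber
print(affordability_colour(250.0, 200.0))  # -> red
```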
Services

A personal advisor for everyone, powered by AI? No more language barriers and hard-to-navigate websites, just conversations. The trade-off being that they might know so much more about you.
Property

Virtual tours available instantly, captured by the agent's glasses and streamed to yours. High definition capture of any issues in the home. Three dimensional guides to any DIY job, from assembling furniture to fixing a leaking tap.
Dating

Who is that? Are they available? Are they a match? This one is fraught with risk.
I could go on and on because the Metaverse will touch every aspect of life. It might be the lens through which we work and study. It will be so ubiquitous that we will rapidly start to assume that everyone can access it, as we have with the smartphone.
The location of the access hardware is important here. Because it is on your head, interactions can be much more subtle. A headset can see what you see, hear what you hear, and answer the questions that those sensory inputs throw up before you even verbalise them. The AI that sits behind your AR experience, personalising your environment and picking up on your needs, will become very much a co-pilot. The shift to the Metaverse will mean the cognitive augmentation of every human who engages with it. That has implications for inequality, but also for health: imagine the help it could be to those suffering with dementia.
In summary: whatever your field, you should be contemplating what this shift will do to your sector. Because I am more certain than ever that it is coming. And more certain than ever that it will touch just about everything.