Here’s why augmented reality will be the gateway to metaverse – BusinessToday

There are those who say “the Metaverse” is the future of society and will transform all aspects of life in the years to come. Others strongly disagree, saying the Metaverse is a niche technology that will be limited to gamers and socialising teens. The fact is, both views are correct. 

The disconnect arises because the word Metaverse means different things to different people. To resolve this, we need to make our language more precise, as the industry is actually pursuing two very different concepts: the virtual Metaverse and the augmented Metaverse, each of which will have a unique rate of acceptance and a very different impact on society.

But first, what is a Metaverse? Having been involved in VR (virtual reality) and AR (augmented reality) from the very early days, I’m often asked how I define key markets and technologies. For Metaverse, I define it as follows: 

A Metaverse is a persistent and immersive simulated world experienced in the first person by large groups of simultaneous users who share a strong sense of mutual presence. It can be entirely virtual (a virtual Metaverse) or be a rich virtual layer added to the real world (an augmented Metaverse). 

Some people would go further and insist that a Metaverse must also be a general-purpose world rather than application-specific, and that it include rules of conduct and an economy. Whether you add those limitations or not, I believe the virtual Metaverse will be increasingly popular for gaming, entertainment, and socialising but will be limited to short-duration uses for the majority of the public.

On the other hand, the augmented Metaverse will transform society, replacing phones and desktops as the central platform of our lives. This transformation will begin in 2024 when the first fully functional AR glasses hit consumer markets from major manufacturers. It will then follow an adoption curve similar to the rise of smartphones after the iPhone launch in 2007. Market penetration will be rapid, as AR will be required to access valuable layers of information.    

To explain why the augmented Metaverse will transform society and the virtual Metaverse will have limited use, I’d like to jump back to my first experience doing virtual reality research thirty years ago at NASA. I was working with early vision systems, studying how to model interocular distance (the distance between your eyes) to optimise depth perception. While this work resulted in a couple of mildly interesting academic papers, the impact of that research on my understanding of immersive technologies had nothing to do with improving depth perception in VR.

Instead, the impact came from the countless hours I had to endure writing code using a variety of early VR hardware. As someone who truly believed in the potential of virtual reality during those early days, I found the experience a bit miserable. It wasn’t because of the low fidelity; I was sure that would steadily improve, as would the size and weight of the hardware.

No, I found the virtual experience unpleasant because it felt confining and claustrophobic to have hardware on my face for extended periods. And even when I used early 3D glasses (i.e. shutter glasses for viewing 3D content on flat screens), the confining experience didn’t go away. That’s because I still had to keep my gaze forward, as if wearing blinders to the real world. It made me want to pull the blinders off and allow the power of VR to be splashed all over my physical surroundings.

This sent me down a path in 1992 to develop the Virtual Fixtures platform for the U.S. Air Force, the first system that allowed users to interact with a combined reality of real and virtual objects. This was before phrases like “augmented reality” or “mixed reality” were in use, but even in those early days, as I watched users enthusiastically experience that first taste of augmented reality, I became convinced the future of computing would be a merger of the real and the virtual. 

Now, thirty years later, VR hardware is drastically cheaper, smaller, lighter and delivers much higher fidelity. The software is significantly better too, running on computers that are thousands of times faster with powerful GPUs that couldn’t have been imagined in the 1990s. And yet, the same problems I experienced thirty years ago still exist. The barrier was never fidelity – it was the deep aversion people have to feeling cut off from their surroundings.

Which is why the Metaverse, when broadly adopted by the general public, will be a merger of our real surroundings with rich layers of virtual content, delivered by see-through eyewear. This will hold true even though VR headsets will offer higher fidelity than AR glasses. But again, fidelity will not be the driver of adoption. Instead, markets will follow the technology that offers the most natural experience to the human perceptual system. And the most natural way to present digital content to the human organism is by integrating it directly into our physical surroundings.

Of course, a minimum level of fidelity is required, but what is far more important is perceptual consistency. By this I mean that all sensory signals (sight, sound, touch, and motion) are aligned to create a single mental model of the world in your brain. With augmented reality, this can be achieved with relatively low fidelity, so long as virtual elements are spatially registered to your surroundings in a convincing way. And because our sense of depth perception is relatively coarse, it is not hard for virtual content to appear convincingly placed in the real world.
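As a rough illustration of that "coarse depth perception" point, here is a back-of-the-envelope sketch (in Python) of how quickly stereoscopic depth resolution degrades with distance. The 65 mm interocular distance and the 1-arcminute disparity threshold are illustrative assumptions for this sketch, not figures from the article.

import math

def vergence_angle(distance_m, ipd_m=0.065):
    # Convergence angle (radians) of the two eyes fixating a point at the
    # given distance, for an assumed ~65 mm interocular distance.
    return 2.0 * math.atan(ipd_m / (2.0 * distance_m))

def depth_resolution(distance_m, ipd_m=0.065, threshold_rad=math.radians(1.0 / 60.0)):
    # Smallest depth difference (metres) whose change in disparity exceeds the
    # threshold, using the small-angle approximation delta_d ~ (d^2 / ipd) * delta_theta.
    # The 1-arcminute threshold is an assumed, illustrative value.
    return (distance_m ** 2 / ipd_m) * threshold_rad

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"{d:4.1f} m: vergence {math.degrees(vergence_angle(d)):5.2f} deg, "
          f"depth resolution ~{depth_resolution(d) * 100:5.1f} cm")

Under these assumptions, depth resolution falls off roughly with the square of distance (about a millimetre at half a metre, but more than ten centimetres at five metres), which is why virtual content a few metres away needs only modest registration accuracy to appear anchored in the real world.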

But for virtual reality, providing a unified sensory model of the world is harder. This might sound surprising because it’s far easier for VR hardware to provide high fidelity visuals. But that’s not the problem. The problem is your body. Unless you are using elaborate and impractical hardware, your body will be sitting or standing still while most virtual experiences involve action and motion. This inconsistency forces your brain to build and maintain two separate models of your world – one for your real surroundings and one for the virtual world that is presented in your headset.

When I say this, some people push back, forgetting that regardless of what’s in their headset, their brain still maintains a model of their body sitting on their chair, facing a particular direction in a particular room, with their feet touching the floor. Because of this perceptual conflict, you get the same uncomfortable feeling of being cut off from the world that I experienced 30 years ago. It’s fine for short periods, and there are ways to reduce the effect, but it’s only when we merge the real and virtual worlds into a single experience (i.e. foster a unified mental model) that it’s fully resolved.

Which is why augmented reality will be our gateway to the Metaverse and will eventually replace phones and desktops as our primary interface to digital content. This will make our world a magical place, unleashing amazing opportunities for artists, designers, entertainers, and educators who will embellish our world in ways that defy constraint (see Metaverse 2030 for fun examples). Of course, we also need to be vigilant to ensure our augmented future is safe, private, and free from corporate manipulation. After all, within ten years this technology will be everywhere. 

(The author is a pioneering researcher and entrepreneur in the fields of augmented reality, virtual reality, and artificial intelligence. He developed the first interactive AR system for the US Air Force (the Virtual Fixtures platform), founded the early VR company Immersion Corporation (1993), and founded the early AR company Outland Research (2004). He is currently CEO and Chief Scientist of Unanimous, an AI company focused on amplifying human intelligence in shared environments.)
