Metaverse Madness — Life on the internet is the next frontier

12 min read · Dec 26, 2021


Life as we know it is about to be deeply disrupted. How ready are we to become hybrid versions of ourselves and embrace inhabiting the digital world through AR/VR?

Metaverse, Photo Credit — AdobeStock images.

A colony on Mars may or may not happen in our lifetime, but we are about to see a new world take shape, one we may choose to inhabit, work in, and have fun in. An upcoming wave of the tech revolution may soon transform the 2D virtual world of the internet as we know it into a 3D virtual space overlaying our physical world. This is the Metaverse, and the AR/VR revolution is driving it. The idea of the Metaverse is grand and unparalleled, changing our relationship with screens. It will disrupt our lives, our work, the way we consume and process information, and our world itself.

So what is the Metaverse, and how will AR/VR make it the new normal?

The Metaverse is, simply put, a term for how we will experience the internet itself. It combines our physical reality with augmented, mixed, or virtual reality to create a virtual world that lies beyond, on top of, or as an extension of our physical world. The Metaverse could grow into a full-fledged social system and economy spanning the real and virtual worlds, where avatars, experiences, content, and goods move seamlessly. It would be a real-time, living experience that never pauses or ends the way a video game does. In the words of Mark Zuckerberg, it is the closest feeling to teleportation one can have, existing in multiple digital universes with other people.

How will the Metaverse move beyond gaming to become a mainstream hub of social life?

The concept of the Metaverse was first explored in Neal Stephenson’s 1992 sci-fi novel Snow Crash, Ernest Cline’s Ready Player One, and William Gibson’s Neuromancer. And since it was first imagined, the impetus behind the concept has come from games like Fortnite, Minecraft, and Roblox.

Today, these multiplayer online games have evolved into popular social platforms where people converse not only about the game but also socially, building strong friendships and social capital. One estimate suggests there would be about 2.9 billion gamers worldwide by the end of 2021, of which Minecraft has around 126 million players, Fortnite 350 million, and Roblox 202 million. The sheer number of players across these platforms shows the direction of momentum in social technologies.

Even apps like Discord followed this socialization trend, creating a space for gamers to gather and communicate while playing. Furthermore, Epic Games’ in-game live music events have given us a sense of what organizing social events in the Metaverse could look like: people dressed up in their favorite skins to attend concerts and interact with celebrities like Ariana Grande and Marshmello. These virtual spaces even provided the real-world experience of walking around the game world and buying digital merchandise, just as you would at a concert in real life.

During the pandemic, more people turned to virtual gaming spaces not only for the riveting experience of immersive worlds but also to forge new friendships. The level of social fulfillment these spaces provide goes beyond liking someone’s posts on Facebook or Instagram.

The Metaverse’s potential goes beyond any specific game: it can become a platform for others to build upon and collaborate in. It is currently difficult to say with conviction who will lead the Metaverse race. But many companies, such as Facebook, Snapchat, Google, Microsoft, and Apple, are pegging success in the Metaverse to success in AR/VR. They intend to replicate the success of the gaming world, which has created more intimate and meaningful social interactions within small curated user groups, while also carrying the dynamics of those interactions into the physical world.

Meta, the parent company of Facebook, has a grand vision of becoming the mothership of the Metaverse. It wants to be an integration point for the Metaverse by building standard infrastructure, such as tools, APIs, and hardware, to lock in the developer community. Facebook Reality Labs is investing an astronomical $10 billion into its Oculus VR headsets, Project Cambria, Facebook smartglasses, and developer-led platforms like Horizon and Spark AR.

However, the Metaverse as a concept is not device-dependent. It can be experienced even on our current screens, like phones and computers, but VR/AR can provide an unprecedented sense of teleportation and virtual space.

What would the Metaverse look like?

Imagine if the internet became visual. You would be at the atomic center of action for all media, social, commerce, and work products. You would have an avatar, a living 3D representation of you with your expressions, gestures, and so on. You could stylize it for different occasions, such as work, social gatherings, or a fantasy form. You could have a wardrobe full of clothes collected from various designers and apps. And your avatar could interact, experience, and share every digital detail, making interactions much richer.

You could design your own virtual spaces and populate them with digital items. You could transport your photos, videos, and music from the physical world into the Metaverse. You could also interact with brands, experience product demos, and purchase digital goods. Digital commerce will flourish in the Metaverse as content creators, brands, and experience creators offer in-the-presence moments.

The Metaverse can become a fully functioning economy. Interoperability and seamless teleportation are key to it. Interoperability mainly means that you can, for example, purchase a car design in one app and use it in all the others. Although digital and virtual economies already exist, such as owning a moment from your favorite sport, the Metaverse opens the possibility of trading such goods as unrestrictedly as we do in the physical world.

How close have we come to realizing the Metaverse?

The Metaverse promises the most immersive and frictionless experience of the digital world. But for its key building blocks to emerge as the technology envisioned in science-fiction novels, tremendous innovation and improvement are needed in both hardware and software.

Many companies, Meta among them, are working to take this experience to the next level via advanced neural computer interfaces (devices that can be controlled with your mind). Remember when Facebook bought mind-controlled-wristband maker CTRL-Labs a couple of years ago?

Advancements in VR/AR technologies —

VR Hardware — the design and development of VR devices is happening at an unprecedented pace. We have already seen significant advancements across VR input and output devices.

VR output devices are mainly head-mounted displays (HMDs), haptics, and multi-sensor devices. HMDs now come in wired and wireless formats, comprising accelerometers, magnetometers, and gyroscopes, and use sensor fusion to combine this information with optical tracking. The performance of graphics chips in wireless devices has also increased significantly. The Oculus Quest, for example, has an eight-core Qualcomm Snapdragon 835 chipset and 4 gigabytes of RAM. Major tech companies such as Google, Facebook, and HTC have increased their emphasis on these wireless yet powerful virtual reality headsets, owing to the significant increase in their sales in 2020.
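To make the sensor-fusion idea concrete, here is a minimal sketch (not any headset vendor’s actual tracking code, and with made-up sample values) of a complementary filter, a common way to blend a gyroscope’s fast but drifting angle integration with an accelerometer’s noisy but drift-free gravity reference:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend one axis of orientation: integrate the gyro rate (fast, drifts)
    and pull the estimate toward the accelerometer angle (noisy, stable).
    alpha close to 1.0 trusts the gyro more on each step."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# Illustrative loop: 100 samples at 100 Hz, headset actually pitched at
# 10 degrees, gyro reporting no rotation. The estimate converges toward
# the accelerometer reference instead of drifting.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
```

Real headsets fuse all three sensors (plus optical tracking) with more sophisticated estimators such as Kalman filters, but the blending principle is the same.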

(In 2020, virtual reality device shipments stood at 13.48 million units and were expected to reach 112.62 million units by 2026. The top three players in VR headsets by sales are Sony, Facebook, and HTC.)

Eye tracking technology. Photo credit — FOVE demo

There is also eye-tracking technology from the Japanese company FOVE, which enables foveated rendering and fully hands-free interaction. FOVE uses a combination of hardware and software to identify where the user’s gaze is directed, enabling features like gaze-based interaction, depth-of-field rendering, and foveated rendering.
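As a rough illustration of the foveated-rendering idea (not FOVE’s actual algorithm; the radii and shading rates below are invented parameters), the renderer can assign full resolution near the tracked gaze point and progressively coarser resolution toward the periphery, where the eye cannot perceive fine detail:

```python
def foveated_shading_rate(pixel, gaze, fovea_radius=100, periphery_radius=400):
    """Return a shading-resolution scale (1.0 = full rate) for a screen pixel
    given the tracked gaze point, both in pixel coordinates."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= fovea_radius:
        return 1.0          # full shading rate around the gaze point
    if dist >= periphery_radius:
        return 0.25         # quarter rate in the far periphery
    # Linear falloff between the fovea and the periphery.
    t = (dist - fovea_radius) / (periphery_radius - fovea_radius)
    return 1.0 - 0.75 * t
```

Because most pixels fall outside the fovea at any instant, shading them at a fraction of full rate is where the GPU savings come from.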

Advancements in haptics and multi-sensor devices aim to support natural hand and leg movement in the immersive environment. However, several cost, engineering, and safety considerations remain unsolved in the pursuit of added immersion and realism. For example, being able to walk during VR experiences adds realism, but tripping over cables or nearby furniture, or the need for a large room, raises considerable safety issues. Therefore, most hardware development pushes for a seated experience in which the user remains stationary: it eases tracking, avoids those tripping hazards, and removes the need for treadmills or large rooms.

VR input devices are mainly controllers and navigational devices like treadmills, rotational discs, gesture-tracking devices, and so on. Input controllers have seen significant improvement. Oculus’s Half Moon controllers, also known as Oculus Touch, have capacitive sensors that allow basic gesture recognition based on which finger touches the device. There is also the Reactive Grip™ Motion Controller developed by Tactical Haptics, in which touch feedback integrated into the handle of the device conveys motion and force information. The device mimics the shear and friction forces one would feel when interacting with real objects, recreating the skin sensations of actually holding an object in virtual space.

The prototype glove from Meta Reality Labs for truly immersive VR experiences. Credit — FACEBOOK REALITY LABS

Meta’s prototype haptic glove is also aimed at creating an actual sense of touch to the user. It uses principles from soft robotics and employs pneumatic and electroactive actuators to quickly inflate tiny air pockets on the fingers and palm of the glove. These actuators are essentially tiny motors that can create the sensation of pressure and, hence, touch. To control these actuators, they’re building the world’s first high-speed microfluidic processor — a tiny microfluidic chip on the glove that controls the airflow that moves the actuators by telling the valves when and how far to open and close.

The idea here is that if Meta could fit thousands of these actuators onto a haptic glove and combine those sensations with the visual input of a VR headset or augmented reality glasses, which project digital images onto the real world, the wearer could reach out and feel virtual objects. With gloves like these, you might one day shake the hand of someone else’s avatar in the metaverse and feel the squeeze.

TESLASUIT measuring biometrics. Photo Credit-

Several companies are attempting haptic body suits, like the TESLASUIT. This suit is designed to capture everything from biometric data to galvanic skin responses, measuring changes in the electrical resistance of the skin, and provides users with sensation and a sense of touch in virtual and augmented reality.

Apple is also among the forerunners in AR/VR development. It has filed several patents for detecting and capturing micro-gestures, whether optically, via biofeedback, or through the wrist and fingers. Apple is rumored to be launching a VR device soon with four sets of advanced 3D sensors capable of gesture recognition and object tracking.

AR Technology —

Basic AR is already available to a broad audience via smartphones equipped with true-depth sensors that create realistic 3D meshes and depth maps of environments and use them to place virtual objects. However, for AR hardware to function the way we see in the Star Wars movies requires complex advancements in hardware and software; it is like fitting a supercomputer into a pair of normal-looking glasses. The challenge begins right at product design, requiring major technology breakthroughs in display technology, audio, input, haptics, hand tracking, eye tracking, mixed reality, sensors, graphics, computer vision, avatars, perceptual science, AI, and more, all while achieving an optimal form factor that millions of consumers can adopt without friction or habit deviation.
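To see how a depth map makes virtual objects feel anchored in the real world, here is a toy sketch (the intrinsics, depth values, and function names are all illustrative, not ARKit or ARCore code) of the basic occlusion test: project the virtual object into the camera image and hide it if the sensed real-world depth at that pixel is closer than the object.

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3D point (camera coordinates, meters)
    to integer pixel coordinates."""
    x, y, z = point_cam
    return int(cx + fx * x / z), int(cy + fy * y / z)

def is_occluded(point_cam, depth_map, fx, fy, cx, cy, eps=0.05):
    """A virtual object at point_cam should be hidden when the real scene's
    depth at its pixel is closer to the camera than the object itself."""
    u, v = project_point(point_cam, fx, fy, cx, cy)
    return depth_map[v][u] + eps < point_cam[2]

# Toy 4x4 depth map: a real wall 2 m away everywhere. A virtual cube
# placed 3 m away projects behind the wall, so it should be occluded.
depth = [[2.0] * 4 for _ in range(4)]
print(is_occluded((0.0, 0.0, 3.0), depth, fx=2.0, fy=2.0, cx=2.0, cy=2.0))
```

Real AR frameworks do this per pixel on the GPU with dense depth maps, but the comparison at the heart of it is the same.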

Google’s smartglasses gave us an early peek at AR, but understandably, they failed because the technology hadn’t evolved enough to provide the augmented-reality experience the company envisioned in its initial marketing. Instead, the product was an extra smart screen, displaying notifications, weather, and other relevant information. Similarly, Facebook, in collaboration with Ray-Ban, dived into smartglasses as a step toward augmented reality, but the result has little to be awed about.

HoloLens 2, Credit —

Comparatively, Microsoft’s HoloLens 2 is a mixed-reality headset currently showing great promise for enterprise customers. HoloLens 2 is capable of instinctual interaction, processing hand gestures, tracking eye movement, and taking voice commands. It projects holograms into the field of view using laser-based display technology and can detect real physical objects in the room using an array of sensors called Azure Kinect. Furthermore, Microsoft’s Azure Spatial Anchors let multiple devices see and interact with the same virtual object. Despite plenty of hype, HoloLens 2 is far from perfect and is not yet available to consumers or the broader developer community.

Along similar lines, Facebook Reality Labs researchers Andrew Maimone and Junren Wang released a 2020 paper titled “Holographic Optics for Thin and Lightweight Virtual Reality.” It outlines how a new class of near-eye displays combining holographic optics with polarization-based optical folding could be used to create ultra-lightweight VR headsets. While still at the prototype stage, the research suggests an approach to VR and mixed reality that nearly eliminates the need for bulky hardware and could project immersive imagery through thin, lightweight lenses.

Game Engines —

Epic Games/Unreal Engine

Unity and Epic Games’ Unreal Engine are the two platforms pouring hundreds of millions of dollars into a massive bet: to be the underlying platform of the Metaverse. Unity dominates AR and VR content with a market share of about 60%, while Unreal Engine’s share lies between 10 and 20%. With the launch of Unreal Engine 5, however, this equation is about to change. Unreal Engine 5 is an advanced 3D creation platform that takes graphics in virtual worlds to the next level, with groundbreaking features such as Nanite and Lumen that provide a generational leap in visual fidelity for creating realistic, expansive worlds with scalable content. Remarkably, the engine requires relatively little system memory to run, making it suitable for a wide range of devices. It also accelerates 3D creation, makes iteration a breeze, and simplifies collaboration. Unreal Engine 5 offers developers full access to source code, a robust C++ API, and Blueprint visual scripting, opening the door to expansive use cases in virtual space.

Software platforms —

As with Apple’s ARKit and Google’s ARCore SDKs for mobile AR app creation, a suite of companies encourages open participation from the community to explore possible use cases regardless of the underlying hardware. Companies including Oculus, HTC, and Google provide SDKs for their respective devices and many popular development environments. These SDKs provide native APIs for key VR features like user input, controller support, and rendering, which developers can use to build immersive VR experiences.

Use cases of the Metaverse

In Free Guy, sunglasses are the key to seeing the movie’s video game world. Credit - 20TH CENTURY STUDIOS

Like in the movie Free Guy (an impressive one), the Metaverse could offer us an alternate universe where we can play, interact, and maybe one day work. Its biggest selling point is eliminating the need for physical proximity, which opens a myriad of possibilities that go far beyond gaming. The first solid use cases, however, will likely emerge from businesses creating digital twins of products and processes before the Metaverse is commercialized for social applications.

For example, in healthcare its applications could range from doctors virtually assisting colleagues in complex operations to simple primary-care drug administration. Even companies like Boeing are planning to build their next airplane in the Metaverse.

The Metaverse can power academics by creating immersive, practical understanding across science, engineering, and the arts. The retail sector can use the visual internet for digital launches and catalog displays, creating personalized experiences for digital consumers. Eventual consumer adoption of the Metaverse will spawn new virtual economies with new ownership models and entitlements for digital goods.

In my view

Life in the Metaverse seems certain, given the social currency and tribalism it can create and AI becoming more intuitive. However, the technology is at least a few years away from being within reach for business and consumer applications to take off in a meaningful way and evolve into a much more powerful version of itself. Furthermore, an ecosystem has to be built with new, stronger governance, privacy, and safety features to ensure users and user-generated data are protected.

And most importantly, the fabric of social structure and the norms of social interaction could be different in the Metaverse. The subtle micro-observations we make in everyday life, say, to tell whether a friend is sad, may not carry over seamlessly, potentially leading to a more isolated social structure. The verdict is not out yet, however; we will have to wait and see whether these technological advancements unite us socially or bring more isolation and polarization.

Business Analyst and Startup enthusiast focused on Artificial Intelligence and Deep Learning