Into the Metaverse: Next-Generation Technologies Shaping Reality’s Future
For the metaverse to thrive, a new set of technologies must be invested in and developed further. But, as Mark Zuckerberg said in the video announcing Facebook’s rebrand to Meta, this is a process that could take a long time.
Yet, even though a fully fledged metaverse is not yet accessible in our present reality, Gartner’s annual report on tech hype noted that mixed reality (MR) has reached the plateau of productivity, with adoption growing thanks to the many use cases already on the market. In addition, Allied Market Research predicts the MR market will grow to $454.73 billion by 2030.
So, what exactly is the metaverse, and which technology trends will help it move from concept to reality?
What is the Metaverse?
It is a digital-reality space where users can interact with a computer-generated world and with other users. According to Mark Zuckerberg’s keynote address, the metaverse is all about “experiences.” Meta’s vision of the metaverse is “a new phase of interconnected virtual experiences using technologies like virtual and augmented reality.”
To understand what is currently happening in this space, we can examine what major companies such as Apple, Microsoft, and Meta are doing, and look at the topics discussed at major industry conferences such as AWE and ISMAR. Based on this analysis, we’ve identified six next-generation technologies.
When we refer to extended reality (MR/VR/AR), we are talking about spatial information: 3D models, audio, and other media that can be spatialized. To achieve this, the first step is to create actual 3D content.
We’ve looked at the capabilities of the iPad’s LiDAR and the HoloLens’s Kinect-derived depth sensors, and we can also look to the metrology industry for the state of the art in 3D reconstruction. Companies such as Zeiss, Nikon, and Faro offer 3D reconstruction devices accurate to less than a millimeter. These devices are far from cheap. There are, however, consumer versions of this technology, such as Apple’s LiDAR on its mobile devices and the Matterport 3D camera, which isn’t as precise as Zeiss’s or Faro’s offerings but comes at a reasonable price for a consumer. Microsoft’s Azure Kinect takes more of an IoT approach. Additionally, Canon and other manufacturers are releasing lenses designed specifically for VR content.
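To make the idea of 3D reconstruction concrete, here is a minimal sketch of how a depth sensor like LiDAR or the Azure Kinect yields 3D points: each depth pixel is back-projected through a pinhole camera model. The intrinsics (`fx`, `fy`, `cx`, `cy`) and the toy depth image below are illustrative values, not figures from any real device.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud
    using a pinhole camera model: X = (u - cx) * Z / fx, and so on."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) array and drop invalid (zero-depth) pixels
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Toy 2x2 depth image, every pixel exactly 1 m from the camera
depth = np.ones((2, 2), dtype=np.float32)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
```

Real sensors add lens distortion, noise filtering, and frame-to-frame registration on top of this, but the core geometry is the same.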
3D reconstruction is not only vital in itself; we also need to consider how to manage all that data so users can access it. Think of how Sketchfab lets us build an immense user-generated 3D content library that can be shared seamlessly on social media sites such as Facebook. We need a system that allows metaverse users to create and access content on top of our real world or within our virtual worlds, from anywhere and across different devices, depending on the environment.
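One practical ingredient of such cross-device access is serving each device an appropriate level of detail (LOD) of the same asset. The sketch below is a hypothetical data model, not any real platform’s API; the LOD names and triangle budgets are made up for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SpatialAsset:
    """A shareable 3D asset carrying several levels of detail (LOD),
    so phones, headsets, and desktops can each fetch a suitable version."""
    asset_id: str
    creator: str
    # Maps an LOD name to its triangle count; values are illustrative
    lods: dict = field(default_factory=dict)

    def pick_lod(self, device_triangle_budget: int) -> str:
        """Return the richest LOD that fits within the device's budget."""
        fitting = {name: tris for name, tris in self.lods.items()
                   if tris <= device_triangle_budget}
        if not fitting:
            raise ValueError("no LOD fits this device")
        return max(fitting, key=fitting.get)

chair = SpatialAsset("chair-001", "alice",
                     lods={"mobile": 5_000, "headset": 50_000, "desktop": 500_000})
```

A headset with a 60,000-triangle budget would receive the `headset` version, while a low-end phone would fall back to `mobile`.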
It is also important to consider how each MR device perceives the world. With devices such as the HoloLens, Magic Leap, and HTC Vive Pro, we can get a real-time 3D reconstruction of our surroundings, allowing digital content to interact with the physical space in real time. For instance, if you move a piece of furniture in the actual world, the 3D content reacts, extending our experience.
These sensors are currently limited by what the device’s SoC (system on chip) can handle. But given how crucial reconstruction is to the interplay between the real and the digital, it is easy to imagine how this could evolve as more powerful silicon becomes available in XR devices. The degree of immersion could be amazing!
Now that we’ve discussed 3D reconstruction, let’s consider the next major XR integration: AI. Consider an analogy. Our brains have two neural pathways that process spatial experience. The first is the dorsal pathway, which identifies relationships of movement and space; in our XR systems, this role is played by 3D reconstruction systems and algorithms such as SLAM and visual-inertial odometry. The second is the ventral pathway, which handles the identification and classification of objects, giving us an understanding of what is around us; in our XR systems, this is done by AI. Together with 3D reconstruction methods, AI allows us to identify our surroundings and provide detailed, context-rich information in the right situations.
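The two pathways come together when a system fuses a tracked camera pose (the “where” from SLAM) with classifier output (the “what” from AI). The sketch below shows that fusion in its simplest form; a real pipeline would use a full 6-DoF transform rather than a plain translation, and the detection data here is invented for illustration.

```python
def fuse_spatial_context(detections, camera_pose):
    """Attach world-space positions to classified objects.
    `detections`: list of (label, offset_xyz) in camera coordinates,
    as a ventral-pathway-style classifier might report them.
    `camera_pose`: camera position (tx, ty, tz) in world coordinates,
    as a dorsal-pathway-style SLAM tracker might report it."""
    tx, ty, tz = camera_pose
    context = []
    for label, (ox, oy, oz) in detections:
        context.append({"label": label,
                        "world_pos": (tx + ox, ty + oy, tz + oz)})
    return context

# The classifier sees a chair 2 m ahead; SLAM says where the camera is
scene = fuse_spatial_context([("chair", (1.0, 0.0, 2.0))],
                             camera_pose=(5.0, 0.0, 1.0))
```

The resulting world-anchored labels are what lets virtual content respond when, say, that chair is moved.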
That isn’t all. We can also highlight empathy AI, which can enhance how we communicate and connect with one another. For instance, a system could alert us when we are falling asleep while driving or performing an unsafe task, something companies such as Affectiva are working to achieve. This brings us closer to context-aware systems that can deliver real-time, relevant information to the metaverse’s users.
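As a toy illustration of the driving example, an empathy-AI pipeline might watch a per-frame eye-openness score and raise an alert when the eyes stay mostly closed over a short window. The window size and threshold below are illustrative, not values from Affectiva’s actual models.

```python
def drowsiness_alert(eye_openness, window=5, threshold=0.3):
    """Return True if the eyes stay mostly closed over the last
    `window` frames. `eye_openness` holds per-frame scores in [0, 1],
    where 1.0 means fully open."""
    if len(eye_openness) < window:
        return False  # not enough history to judge yet
    recent = eye_openness[-window:]
    return sum(recent) / window < threshold

# Eyes drift shut over the last few frames: mean of the window is 0.14
alert = drowsiness_alert([0.9, 0.8, 0.2, 0.1, 0.1, 0.1, 0.2])
```

A production system would derive the score from a face-tracking model and calibrate the threshold per user.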
Internet of Everything (IoE)
As AI is required to provide the right information in context, we also need data to feed these context-aware systems and enable them to function. For that, connectivity between devices is essential.
Think of how smart devices let you control them through your preferred digital voice assistant, e.g., Alexa or Google Assistant. Now take a step further, from the Internet of Things (IoT) to the IoE. The IoE can provide our XR and metaverse systems with information and interactions valuable to the individual user.
In the Meta announcement video, we were shown a recreated apartment where users could turn on the TV simply by looking at it and making a gesture. The IoE could make this a reality, allowing us to control systems outside the XR device itself and removing the limitations of voice commands through today’s virtual assistants.
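A gaze-plus-gesture interaction like the one in that demo boils down to combining two inputs and dispatching a command to a connected device. The sketch below assumes a hypothetical device registry and gesture vocabulary; it is not a real IoE protocol.

```python
def ioe_command(gaze_target, gesture, registry):
    """Combine what the user is looking at (gaze) with a hand gesture
    to produce a command for a connected device. Returns None when
    either input is unrecognized. Gesture names are illustrative."""
    actions = {"pinch": "toggle_power", "swipe_up": "volume_up"}
    if gaze_target not in registry or gesture not in actions:
        return None
    return {"device": registry[gaze_target], "action": actions[gesture]}

# The user looks at the TV and pinches: toggle it on or off
registry = {"tv": "living-room-tv-01"}
cmd = ioe_command("tv", "pinch", registry)
```

The interesting engineering lives in the two inputs themselves, eye tracking and hand tracking, which today’s headsets already provide; the dispatch layer stays this simple.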
A new level of immersion could also come from digital humans: the non-playable characters and virtual assistants of the metaverse. According to Gartner’s 2021 technology hype cycle report, this technology will be in demand within the next 10 years, and XR is its best-fit scenario.
Presently, companies such as UNEEQ are committed to this kind of technology. It will be a huge leap from what we have today with Alexa or Cortana, faceless abstractions, to a more human digital assistant that serves as our personal assistant and gives us contextually rich data, resulting in a more pleasant interaction for users. (This last statement assumes the technology will surpass what we’ve seen in the Turing Test while remaining clearly identifiable as AI, to steer clear of dystopian scenarios where we’re unsure whether we’re talking to an actual human or an AI.)
The Actual Metaverse
At nearly every AWE conference, Ori Inbar, CEO of Super Ventures, has mentioned the AR Cloud and how it can allow us to become completely spatial. As he puts it, “AR researchers and industry insiders have long envisioned that at some point in the future, the real-time 3D (or Spatial) map of the world, the AR Cloud, will be the single most important software infrastructure in computing, far more valuable than Facebook’s Social Graph or Google’s page rank index”.
Recently, at the AWE 2021 Asia edition, Alvin Wang Graylin, President of HTC China, highlighted some of the potentials and issues of the metaverse.
He also presented six metaverse laws that are worth noting:
Only one metaverse exists.
No single person or entity owns the metaverse.
The metaverse is open and accessible to everyone.
The metaverse is device-independent.
Every user can make decisions and have an impact on the metaverse.
The metaverse will be the next stage of the internet.
The neural interfaces currently being developed will serve as enablers for XR and other devices. But there is a long way to go, because connecting neural pathways to chips is easier said than done.
For instance, Neuralink is currently in the R&D stage, while NextMind launched a device in 2020 that lets you control devices with noninvasive EEG technology. There is still much research to be done in this field before it can be used in the marketplace. As demonstrated in this Ted Talk, we can imagine a future in which we manage inputs and outputs through our nervous systems, opening the door to new communication interfaces.
Make Your Move to the Metaverse:
We at BriskLogic are no strangers to the metaverse. Our teams have participated in discussions and projects around VR, AR, and XR, and we’re always ready for the next challenge!
BriskLogic works with Fortune 500 brands and growing startups to develop end-to-end digital products that use innovative methods to stay ahead of the pack. If you’d like to discover how your company can benefit from the metaverse’s world of possibilities, we’d love to talk with you.