The metaverse doesn’t exist yet, and it won’t exist next year either. But even if the technology still doesn’t exist by 2032, it’s likely we’ll at least have some ideas about how to design and manufacture the chips that could make Mark Zuckerberg’s fever dreams a reality.
Over the past six months, a schism has opened between the metaverse corporate America is describing and the one it can actually deliver, owing to the nature of the computing power required to build it. As with the decades-long effort to shrink the personal computer down to the size of an iPhone, getting there will demand significant innovation.
Mark Zuckerberg’s metaverse
Microsoft billed its $68.7 billion bid to acquire Activision Blizzard last month as a “metaverse play.” In October, Facebook rebranded itself as Meta to make the metaverse its central focus. And last year, Disney promised to build its own metaverse to “allow storytelling without boundaries.”
To deliver on these ideas, the industry must be able to produce the chips needed to power the data centres and networking equipment behind them. And right now, it can’t. What semiconductor devices will look like in the future is an open question; no one knows how or where to begin, or even whether they will still be semiconductors. There aren’t enough chips to build everything people want today, much less what the metaverse preachers promise.
Even the biggest systems we see in supercomputing today would need to improve to deliver a metaverse-like experience, according to Jerry Heinz, the former head of Nvidia’s Enterprise Cloud unit.
The metaverse as we know it today has its roots at least as far back as early twentieth-century science fiction.
E.M. Forster’s 1909 story “The Machine Stops,” for example, depicts a pre-chip, pre-digital metaverse. Decades later, William Gibson called the concept “cyberspace,” Neal Stephenson coined the term “metaverse” in “Snow Crash,” and Ernest Cline called it the OASIS in “Ready Player One.” Not all of their visions are utopian.
It’s possible that the metaverse, as we’ve come to know it, will always be confined to the pages of fantasy novels. Be that as it may, Mark Zuckerberg has propelled the concept into popular consciousness.
It’s unclear what Zuckerberg’s vision of the metaverse will look like in the end, but he does include some of the commonalities among its supporters:
It’s an “embodied internet that you’re inside of rather than just looking at,” he said, offering everything you can currently do online as well as “some things that don’t make sense on the internet today, like dancing.”
The metaverse is as nebulous as its name suggests. In the future, it could be used to describe a wide range of technological developments. And it’s possible that early versions of the metaverse may already exist in video game form.
The metaverse will require new computing tech
Millions of people watch live concerts on Roblox and Epic Games’ Fortnite, albeit split into separate instances of only a few hundred people each. Microsoft Flight Simulator has created a 2.5-petabyte virtual world that is updated in real time with flight and weather data.
Only a fraction of what we need to create a persistent virtual world that can be used by billions at once, across multiple devices, screen formats, and in virtual and/or augmented reality is currently available in the market.
Creative Strategies CEO Ben Bajarin told Protocol that “generations of compute” will be needed before the metaverse becomes a truly mass-market, everyday activity. He expects AR and VR to play a bigger role than they do today, with the emphasis on AR, though the result may not be a full 3D simulation.
A generational change
Chips first powered mainframes. Servers, personal computers, and smartphones each grew out of the larger, more complex generation of machines that came before them.
If the metaverse comes next, no one can yet describe its system requirements with any accuracy, because it will differ so much from previous shifts in computing. What has become clear, though, is that chips of nearly every kind will need to be an order of magnitude more powerful than they are today to achieve anything close to the optimistic version.
“Truly persistent and immersive computing, at scale and accessible by billions of humans in real time, will require even more,” wrote Raja Koduri of Intel in a recent editorial. “A 1,000-fold increase in computational efficiency from today’s state of the art will be required.”
It is hard to overstate the difficulty of achieving a thousandfold increase in computing efficiency. And Koduri’s estimate may be conservative; the real demands could easily be ten times higher.
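To put a 1,000-fold target in perspective, here is a rough back-of-envelope sketch. It assumes, purely for illustration, a Moore’s-law-style doubling of efficiency every two years; the doubling cadence is a hypothetical, not a claim from Koduri or Intel:

```python
import math

TARGET_GAIN = 1_000          # Koduri's 1,000x efficiency target
DOUBLING_PERIOD_YEARS = 2    # hypothetical Moore's-law-style cadence

# Doublings needed to reach the target: 2**n >= 1000, so n = log2(1000)
doublings = math.log2(TARGET_GAIN)
years = doublings * DOUBLING_PERIOD_YEARS

print(f"~{doublings:.1f} doublings, ~{years:.0f} years at that cadence")
# -> ~10.0 doublings, ~20 years at that cadence
```

In other words, even if chipmakers could sustain historical doubling rates, which is exactly what’s in doubt, a 1,000x gain is roughly two decades of progress.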
Even if those onerous hardware requirements are met, better communication between every layer of the software stack, from the chips at the bottom to end-user applications at the top, will also be needed, University of Washington computer science professor Pedro Domingos told Protocol.
Right now, “we can get away with [inefficiency],” he said. But in the metaverse, “we can’t.” Artificial intelligence and graphics, he noted, are already being integrated into the entire [software] stack.
It isn’t quantum computing
Or at least, not quantum computing as we know it today: a still-theoretical platform decades away from practical use, requiring room-sized machines cooled to temperatures colder than outer space. But a performance leap of quantum-computing proportions is required.
Algorithms of the kind Google has used to design its chips could help move the needle. Dedicated processors for AI models already exist, but Domingos believes still more specialised chips could squeeze out additional performance. An application-specific integrated circuit that does nothing but physics calculations, for example, could sidestep the limitations of general-purpose silicon.
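To see why a physics ASIC is plausible, consider the inner loop of a particle simulation: small, fixed arithmetic repeated across huge numbers of elements, exactly the shape of workload that fixed-function hardware accelerates. A toy sketch (the function and its names are hypothetical; it uses a simple explicit-Euler update, not any specific chip’s method):

```python
def step_particles(positions, velocities, forces, mass, dt):
    """One explicit-Euler update for a batch of 1D particles.

    This tiny, fixed dataflow (a couple of multiply-adds per particle)
    is the kind of computation an ASIC could hard-wire rather than
    running on general-purpose silicon.
    """
    new_positions = []
    new_velocities = []
    for x, v, f in zip(positions, velocities, forces):
        a = f / mass                     # Newton's second law: a = F/m
        new_positions.append(x + v * dt)
        new_velocities.append(v + a * dt)
    return new_positions, new_velocities

# One 0.1s step for a particle at rest under gravity (force = -9.8 N, m = 1 kg)
pos, vel = step_particles([0.0], [0.0], [-9.8], 1.0, 0.1)
```

Because the dataflow never changes, such hardware skips instruction decoding and branching entirely and just streams particles through fixed arithmetic units, which is where the efficiency win comes from.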
“These companies—chip-makers, or the providers of the metaverse, or who knows—will make more and more advanced chips for this purpose,” Domingos predicted. “There are things you can do at every level of the stack, from the physics to the software.”
Domingos noted that in the 1990s, real-time ray tracing would have been considered impossible, yet decades later it’s done in real time by the chips powering the PlayStation 5 and Xbox Series X. Google’s tensor processing units, the company’s dedicated AI chips, are another example of the kind of increasingly common specialised chip the metaverse will require.
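For a sense of what real-time ray tracing hardware actually accelerates, the core operation is a ray-object intersection test, run billions of times per frame. A minimal Python sketch of the ray-sphere case (illustrative only; real ray tracers run this, plus acceleration structures, in dedicated silicon):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a ray to the nearest sphere intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t, a
    quadratic in t -- the basic geometric test behind ray tracing.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # nearer of the two roots
    return t if t >= 0 else None             # ignore hits behind the ray

# A ray along the z-axis hits a unit sphere centred 5 units away at t = 4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

A single 4K frame casts millions of such rays, each potentially tested against many objects, which is why this moved from “impossible” to feasible only once GPUs gained fixed-function units for it.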
Spectacular future ahead of us
However, as computing evolves through the generations, so must manufacturing technology. Extreme ultraviolet lithography machines are already being used by companies like TSMC and Intel to print the most advanced chips.
For decades, chipmakers have squeezed transistors and other features into ever-tinier spaces, and the latest EUV machines continue that trend. But at some point, either the machines will become prohibitively expensive or shrinking the features further will become physically impossible.
“I don’t want to say that we need a breakthrough, but we’re pretty close,” Bajarin said. Sub-one-nanometer process technology is four to five years away, he added, and it won’t solve this problem.
A lower-quality version of the Zuckerverse is possible without a technological leap forward. For example, a persistent, internet-connected virtual world should be feasible if users accept graphics that are marginally better than what Second Life was capable of achieving a decade ago. Creating that metaverse will necessitate more advanced networking, the specialised chips Domingos described, and perhaps even artificial intelligence computing to handle some of the more complex but mundane workloads that will be encountered.
As a result, today’s data centres will appear “miniature” compared to the ones of the future, Domingos said.
There is a long road ahead. Zuckerberg’s vision of the metaverse could be decades away, and after losing $20 billion on the effort so far, it’s not clear Meta will have the cash to turn that vision into reality.