Commissioned: Not everybody wants to be Iron Man. But there isn’t an artist or engineer around who doesn’t envy the immersive 3D design experience that Tony Stark used to invent new equipment and analyze his existing gadgets in the Avengers movies.
We have been imagining an immersive experience that combines the physical world with virtual, digital worlds for as long as there has been graphical computing. The recent convergence of real-time rendering with photo-realistic ray tracing, assisted by artificial intelligence (AI) at multiple layers of the hardware and software stack, has made a definite difference.
Combine that with GPU-boosted HPC simulation, modeling, and visualization, plus the proliferation of graphical computing and rendering in the cloud, and it now becomes possible to create a digital twin of the existing world that is demonstrably compliant with the laws of physics. Or even to imagine new worlds where the laws of physics are completely different. . . .
Age Of The Metaverse
Transforming the Internet from a 2D screen experience to a 3D immersive experience – what we now call the metaverse – is going to take a tremendous amount of investment. Every aspect of business and our personal lives will eventually be transformed by it, much as they have been transformed by the Internet over the last 25 years. And if historical trends hold, the metaverse is likely to take a third of the time the Internet did to reach the same kind of practical fruition and commercialization.
In its recent report – Value creation in the Metaverse – McKinsey & Company concluded that the metaverse is already “too big for companies to ignore,” with 95 percent of business leaders expecting it to have a positive impact on their industry within five to ten years. The management consultancy also predicted that total spending on the platforms, services, and processes expected to underpin the metaverse’s creation will reach $5 trillion by 2030.
“We have entered the decade of the metaverse,” explains Gavin Wang, senior AI product manager at Inspur Information. “Today, the first metaverse markets and experimental applications are launching. As more industries, businesses and more people become active participants, a treasure trove of data will become available, giving key insights into how the metaverse is evolving and where it is headed, with new businesses and applications developing to match those trends. As 2030 approaches, a mature metaverse will begin to take form, with robust applications and the foundational IT infrastructure backbone to support this new, ubiquitous digital world. It will impact everything.”
More and more industries will become integrated with the metaverse, believes Wang. That means everything from games, film and television, design and advertising to virtual interaction, digital humans and smart cities. The retail, banking, manufacturing, distribution, academic and scientific research, and healthcare industries will all likely make use of digital twins – simulated portions of the real world that can be tweaked – to optimize the real world. And many believe that we will use increasingly realistic virtual avatars to immerse ourselves in these digital twins and have shared experiences.
All of which suggests the metaverse is going to be big business. Sales of rendering and simulation software alone are expected to reach almost $31 billion by 2025, and Inspur expects that we will get our first experience of the metaverse as virtual avatars, which will require the integration of image, speech generation, animation generation, audio and video synthesis, and interaction with reality. By 2030, the virtual human market size – including all of the hardware, software, and services needed to power the metaverse – is projected to reach $527 billion worldwide. Suffice it to say, the metaverse will create a huge and growing demand for systems that can support rendering, visualization, simulation, and AI inference all at the same time, all on a distributed global network like the Internet.
The metaverse will be the future Internet, in fact. Which will leave web browsing and email as we know them today looking as dated and archaic as mainframe green screen applications do in the world of the smartphone.
Forging The MetaEngine
While there is consensus about what the metaverse will require, there isn’t yet a de facto standard that defines the software stack needed to support it.
Meta, formerly known as Facebook and known for its vast social network software stack, is spending tens of billions of dollars creating the hardware and software to run its own variant of the metaverse. Elsewhere Nvidia, a pioneer in AI training and inference processing on GPUs and an innovator in graphics and rendering, has created its own Omniverse Enterprise software stack, which it licenses to run on its own DGX and HGX systems as well as on third party machines like Inspur’s MetaEngine. The Omniverse stack is currently optimized to run atop the VMware ESXi and vSphere server virtualization stack, which is the most popular packaging and deployment tool in the enterprise.
“MetaEngine currently supports Omniverse Enterprise, but does not rely solely on the Omniverse software stack to build the metaverse platform,” says Wang. “Down the road, if there is a different metaverse platform that meets the needs of customers and the market, MetaEngine would be actively adapted to support this software.”
That is why the MetaEngine platform is more likely to be a flexible line of machines, not just a single machine, as the metaverse grows and evolves, according to Wang. In the long run, MetaEngine machines will be created for datacenter and edge scenarios where digital twin infrastructure will be running. And they won’t be run solely by the big hyperscalers and cloud builders that dominate the Internet today, either.
You can see a video of a simulated digital factory powered by Inspur Information’s MetaEngine below:
The metaverse will need tens of millions, and perhaps even hundreds of millions, of machines to be fully realized. And it will require a large distribution of compute and graphics suppliers, much as the early Internet did with many thousands of Internet service providers.
“At present, we see not only the traditional hyperscalers and big cloud providers building out metaverse infrastructure,” Wang says. “Telecom operators are actively working on large-scale computing power and networks to host portions of the metaverse, and cloud computing architecture is bifurcating into clouds, edges, and end points, all integrated and working in harmony.”
“Using the public cloud as a platform to run parts of the metaverse has the advantages of low cost and good scalability, but private cloud infrastructure provides better guarantees for data, security, and service quality than the public cloud,” Wang continues. “We think that given this, public cloud and private cloud metaverse serving will coexist for a long time.”
Underneath The Metaverse Hood
Inspur’s MetaEngine is designed to support both public and private cloud deployment models for the metaverse, and will be preconfigured with metaverse software to take on different aspects of digital twinning and immersive virtual avatar experiences. The initial MetaEngine machine is certified to the Nvidia OVX server specifications, and will come with worldwide sales and support from Inspur Information.
The MetaEngine machine comes in a 4U rack-mounted chassis and has a host motherboard with a pair of Intel “Ice Lake” Xeon SP Platinum 8362 processors, 1 TB of main memory, and 16 TB of NVM-Express flash storage. Perhaps more importantly, it also includes eight Nvidia A40 GPU accelerator cards, which run the rendering, graphics, and AI inference workloads that will play such a crucial role in creating the metaverse.
And because the 3D metaverse is a massively distributed application – just like the 2D Internet has been – the MetaEngine includes a lot of networking so that workloads can be distributed across a cluster of machines. Specifically, the MetaEngine system comes preconfigured with three Nvidia ConnectX-6dx network interface cards, each of which has a pair of 100Gb/sec ports, which when combined provide 600Gb/sec of aggregate bandwidth. The A40 GPUs and the ConnectX-6dx NICs are linked to each other by a pair of PCI-Express 4.0 switches embedded on the board, each one linked to one of the Ice Lake CPUs.
In this way, each quad of A40 GPU accelerators has its own network interface, allowing for GPUs to talk to each other over an RDMA network (either InfiniBand or Ethernet) without having to go through the CPU hosts.
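The I/O arithmetic above can be sketched in a few lines. This is purely illustrative bookkeeping of the figures quoted in the article, not an Inspur or Nvidia API:

```python
# Figures from the article: three dual-port ConnectX-6dx NICs at
# 100 Gb/sec per port, eight A40 GPUs split across two PCI-Express 4.0
# switches (one per Ice Lake CPU). The names here are illustrative.

NICS = 3                 # ConnectX-6dx cards per chassis
PORTS_PER_NIC = 2        # dual-port cards
PORT_GBPS = 100          # 100 Gb/sec per port

aggregate_gbps = NICS * PORTS_PER_NIC * PORT_GBPS
print(aggregate_gbps)    # 600 Gb/sec of aggregate bandwidth

GPUS = 8
PCIE_SWITCHES = 2
gpus_per_switch = GPUS // PCIE_SWITCHES
print(gpus_per_switch)   # each quad of A40s shares one switch's NICs
```

Because each quad of GPUs sits behind its own PCI-Express switch with its own NIC path, RDMA traffic between GPUs on different nodes never has to traverse the host CPUs.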
Inspur Information is working to tailor metaverse solutions for specific markets. Take virtual avatar creation as an example. To make one, you start with the video and audio of an actual person on camera. Virtual humans are then created using natural language models, the audio output of which can be integrated with the Omniverse Avatar stack to create and render the virtual avatar in real time based on the real human inputs. Different language models with different language inputs and outputs can be swapped into this metaverse stack, all supported by MetaEngine.
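The avatar pipeline described above can be sketched as a chain of swappable stages. The function bodies below are placeholders, not real Omniverse Avatar APIs; a production stack would call actual speech recognition, language model, synthesis, and rendering services at each step:

```python
# Hypothetical sketch of the real-time avatar loop: human audio in,
# rendered avatar response out. All stage implementations are stubs.

def transcribe(audio_frame: bytes) -> str:
    """Speech recognition stage (placeholder)."""
    return "hello"

def generate_reply(text: str, language: str = "en") -> str:
    """Natural language model stage; models for different languages
    can be swapped in here (placeholder)."""
    return f"[{language}] reply to: {text}"

def synthesize_and_animate(reply: str) -> dict:
    """Audio synthesis plus animation generation, which a renderer
    such as Omniverse Avatar would turn into frames (placeholder)."""
    return {"audio": reply.encode(), "animation": "lip-sync keyframes"}

def avatar_step(audio_frame: bytes, language: str = "en") -> dict:
    """One loop iteration through all three stages."""
    text = transcribe(audio_frame)
    reply = generate_reply(text, language)
    return synthesize_and_animate(reply)
```

The point of structuring it this way is the swappability the article describes: replacing `generate_reply` with a model for a different language leaves the rest of the pipeline untouched.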
The metaverse is coming, it’s just a question of when individual organizations want to jump on board.
Commissioned by Inspur.