NVIDIA is doing intriguing work in the extended reality (XR) space, and the most recent example, demonstrated in the Omniverse XR video above, is what the chipmaker calls “the first full-fidelity, fully ray-traced VR.” Ray tracing is a processor-intensive technique that simulates how light behaves in the real world, and it is a potent tool for enhancing any kind of immersive virtual reality experience.
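To give a flavor of what ray tracing computes, here is a minimal sketch of its core geometric step: testing whether a ray of light hits a sphere. This is a textbook illustration only, not NVIDIA's renderer; the function name and scene are hypothetical.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance t along the ray to the nearest sphere hit, or None.

    `direction` is assumed to be a unit vector, so the quadratic's
    leading coefficient is 1.
    """
    # Vector from the sphere center to the ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # discriminant of the ray-sphere quadratic
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two intersections
    return t if t > 0 else None  # hits behind the origin don't count

# A ray fired straight down the -z axis toward a unit sphere at z = -5
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # 4.0: the ray strikes the near surface of the sphere
```

A full ray tracer repeats this test for every pixel against every object, then spawns further rays for shadows and reflections, which is why the technique is so demanding in real time.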
The company describes Omniverse XR as a beta-stage “immersive spatial computing app that enables you to interactively assemble, light, and navigate Omniverse scenes in real-time, individually or collaboratively with a team.” It currently supports the Oculus Touch controllers and the HTC Vive.
While rendering light in real time is a fascinating experiment pointing the way toward the future of XR, some obstacles stand between NVIDIA's experiment and this becoming a commonplace feature of Web3 apps. The biggest is the computing power required: the company has indicated that optimum performance calls for two RTX 3090 cards, and that is even though Omniverse XR incorporates static foveated rendering, a technique that improves performance by rendering only the center of the lens at full resolution.
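The idea behind static foveated rendering can be sketched in a few lines: shade pixels near the lens center at full rate and drop the rate toward the periphery. The radii and rates below are illustrative assumptions, not NVIDIA's actual parameters.

```python
import math

def shading_rate(px, py, width, height, fovea_radius=0.25):
    """Pick a shading rate for a pixel under static foveated rendering.

    Pixels within `fovea_radius` (as a fraction of the half-width) of
    the lens center are shaded at full resolution; farther rings get
    half- and quarter-rate shading. Hypothetical parameters.
    """
    cx, cy = width / 2, height / 2
    # Normalized distance from the lens center (1.0 is roughly the edge)
    dist = math.hypot(px - cx, py - cy) / (width / 2)
    if dist <= fovea_radius:
        return 1.0   # full resolution at the center of the lens
    elif dist <= 2 * fovea_radius:
        return 0.5   # half-rate shading in the mid periphery
    return 0.25      # quarter-rate shading at the edges

print(shading_rate(960, 540, 1920, 1080))  # 1.0: dead center
print(shading_rate(0, 0, 1920, 1080))      # 0.25: far corner
```

Because the rate map is fixed per lens rather than driven by eye tracking, the savings are “static”: the GPU always spends its full-resolution budget on the same central region.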
NVIDIA says this is a “unique approach” to building an XR environment in real time, and the chances are good it will one day be standard. But we'll have to wait until commercially available computing power is up to the task of creating photorealistic spaces; for average consumers, we're not there yet.