Varjo today announced its new Varjo Reality Cloud platform, a device-agnostic VR meeting platform which can scan and share physical spaces in real-time using the depth sensors on its latest XR-3 headsets. The company doesn’t have a specific release date for the tech just yet, but is presenting a glimpse of what it plans to deliver in the future. Road to VR got an early look at the new tech.

Varjo Reality Cloud

Varjo is building a virtual meeting platform not unlike others we’ve seen before, but with one key difference: the company plans to leverage the wide field-of-view depth sensors on its Varjo XR-3 headset to let users easily capture their surroundings and use that capture as the basis for virtual meetings. What’s more, beyond just a static capture, the Varjo Reality Cloud platform will continuously update the portion of the environment that’s in the headset’s view while the meeting is happening. That means if there’s something real and relevant in the host’s environment—like a book, product, or even another person—the virtual viewers will be able to see that thing moving and updating in real-time (as long as the host is actively looking at it).

The idea melds well with the XR-3’s existing high-quality passthrough capabilities. During normal use it’s easy to toggle on the headset’s passthrough view to see the environment around you, which means that if you were sharing your local environment through the Varjo Reality Cloud, it could seem like others in the meeting were standing in the same room as you.

Varjo thus likens its Reality Cloud platform to ‘teleportation’, though I wouldn’t say it goes that far just yet.

Hands-on With the Prototype

I got to see an early prototype of the Varjo Reality Cloud in action during a meeting with the company in Silicon Valley. Using the Varjo XR-3 headset, I was shown a pre-recorded example of a Varjo Reality Cloud meeting space with a person standing in the center of the room talking, gesturing, and showing me some objects from around the room. While most of the room around me was static, the person was essentially being ‘filmed’ by an XR-3 headset, which meant their movements (and anything in a certain area around them) were being updated in real-time.

To be clear, the environment I was seeing wasn’t flat or even 180-degree footage; it was an actual volumetric space, and so was the person standing inside the room. And while I could definitely make out the specific person I was looking at and the room around me, at this prototype phase the fidelity leaves a lot to be desired. The room scan and the person in front of me were assembled from a splotchy point-cloud of colored dots—far from the incredible quality of several of Varjo’s photogrammetry demos that I’ve seen in the past.

While it’s almost certain that the Varjo Reality Cloud won’t look as good as careful pre-captured photogrammetry any time soon, the company says that what I was looking at is merely a proof of concept, and that improvements in fidelity are expected as they move forward with development.

One important part of that ongoing development will be moving the whole thing into the cloud. While the demo I saw was a pre-recorded example of the Varjo Reality Cloud, ultimately the company plans to stream the captured environments from the cloud to all participants, with the bulk of the computing done server-side. To do so at the highest possible quality on its ultra-high resolution headsets, the company says it has developed a foveated compression algorithm to cut the stream down to just “single megabytes per second.” My understanding is that the algorithm specifically takes advantage of the eye-tracking that’s built into Varjo headsets.
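Varjo hasn’t published details of its algorithm, but the general idea behind foveated compression is straightforward: spend most of the bitrate where the eye is actually looking, and progressively less toward the periphery, where human vision can’t resolve fine detail anyway. A minimal sketch of that budget-allocation step might look like this (the falloff model and tile layout here are illustrative assumptions, not Varjo’s implementation):

```python
import math

def foveation_weight(eccentricity_deg, falloff=10.0):
    """Quality weight that decays with angular distance from the gaze point.

    This 1/(1 + e/falloff) curve is a hypothetical model chosen for
    simplicity; a real codec would tune this against perceptual data.
    """
    return 1.0 / (1.0 + eccentricity_deg / falloff)

def allocate_bitrate(tile_centers, gaze, total_kbps):
    """Split a fixed bitrate budget across image tiles by gaze eccentricity.

    tile_centers: list of (x_deg, y_deg) tile centers, in degrees from view center
    gaze:         (x_deg, y_deg) current gaze point from the eye tracker
    total_kbps:   overall bitrate budget to distribute
    """
    weights = []
    for (tx, ty) in tile_centers:
        ecc = math.hypot(tx - gaze[0], ty - gaze[1])
        weights.append(foveation_weight(ecc))
    total_weight = sum(weights)
    # Each tile gets a share of the budget proportional to its weight,
    # so foveal tiles are encoded at much higher quality than peripheral ones.
    return [total_kbps * w / total_weight for w in weights]
```

With two tiles, one at the gaze point and one 30 degrees out, the foveal tile would receive roughly four times the bitrate of the peripheral one under this model, while the total stream size stays fixed. The eye-tracking dependency is what makes this viable on Varjo’s headsets: without knowing where the user is looking, the encoder would have to spend full bitrate everywhere.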

Device Agnostic

But Varjo headsets aren’t the only devices that will be able to join the Varjo Reality Cloud. While it’ll take an XR-3—with its built-in depth sensors—to capture and stream the environments, the company says it’s taking a device-agnostic approach to participants, who could join Varjo Reality Cloud sessions from computers, smartphones, tablets, and other VR headsets.


