If you haven’t attended Microsoft Envision UK recently, you might have missed this great video about how you can use HoloLens 2 for Holoportation, Digital Twins and more photorealistic avatars (well, at least the top portrait of them) in an easy way.
In the video, Alex Karim, Metaverse Lead at Microsoft UK, discusses and demos with Mark from Fracture Reality what is already possible with HoloLens 2 today. I was impressed by various aspects of the video, but what made me share it here on my blog is how you can use HoloLens 2 with Azure Remote Rendering to create and engage with content that really stands out – including holograms of people (today) and live Holoportation (upcoming). The avatar creation from a photo also looked like a pretty cool way to bring a bit of holorealism to meetings.
The JoinXR platform, developed by Fracture Reality, already seems to be filled with interesting features that can enable Metaverse and futuristic meetings today. While it doesn’t appear to be built on Microsoft Mesh but is a separate platform, it is something that could be utilized to make remote meetings very different from what they are as 2D meetings. It is engineered towards requirements that involve 3D models, but I can see this kind of meeting being very effective as an executive board space and in training/learning scenarios – including rich presentations, data and video shown alongside 3D objects.
No, I don’t have any connection to Fracture Reality, and I haven’t tried JoinXR. Yet. But I was very impressed by the Envision demo and video, so I wanted to share this with my readers.
Especially the rendering of Alex’s Digital Twin (a hologram recording) using Azure Remote Rendering was awesome. I wish I could just grab a HoloLens 2 and check it out myself from the viewpoint Alex and Mark had.
On 8th of June 2022 at Mixed Reality Dev Days, Microsoft announced “the public preview of point cloud file support in Azure Remote Rendering! Remote Rendering enables developers to render high-quality interactive 3D content and stream it to devices like HoloLens 2 in real time. Today, Remote Rendering is great for rendering detailed meshes from FBX and GLTF/GLB files. In the past you would need to convert to these formats to use your models with the service. Now with point cloud file support, you can take raw data from your favorite lidar scanner and visualize places and spaces quickly and easily. We have added this feature in a way that does not change how you use the service. Point cloud capabilities have been one of the top asks from our partners and developers, so we are excited to bring this to public preview today! We have also made numerous updates to the service to improve performance and usability.”
In more technical terms, Remote Rendering now supports converting .e57, .ply, and .xyz point cloud files directly to the .arrAsset format. This allows quick use of raw lidar scan data as content with Remote Rendering. Premium Remote Rendering supports point cloud files with up to 2.5 billion points.
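To make the .xyz part concrete: it is a simple plain-text format where each line holds one point’s coordinates (optionally followed by color values). As a minimal sketch – the sphere data here is purely hypothetical sample content, not anything from the video – this is roughly what producing such a file looks like before handing it to a conversion pipeline:

```python
import math

def write_xyz(path, points):
    """Write points as a plain-text .xyz file: one 'x y z' line per point."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"{x:.6f} {y:.6f} {z:.6f}\n")

# Hypothetical sample data: points on a unit sphere.
points = []
for i in range(32):
    for j in range(16):
        theta = 2 * math.pi * i / 32       # longitude
        phi = math.pi * (j + 0.5) / 16     # latitude
        points.append((
            math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi),
        ))

write_xyz("sphere.xyz", points)
```

A file like this (or, more realistically, a raw lidar export) would then go through the Remote Rendering conversion service to become an .arrAsset, as described in the announcement.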
It sounds to me like the recorded (Holoported) hologram of Alex was created using this support. Read more in Microsoft’s announcement at the Tech Community Blog Article.