Microsoft released a news article about Microsoft Mesh on 16 January 2023. While they didn’t reveal any timeline, they did share interesting information about how Microsoft Mesh is used and about its capabilities. Working together, Microsoft, the World Economic Forum (Forum) and Accenture built an early adopter experience for the World Economic Forum Annual Meeting in Davos, Switzerland.
In the Mesh experience, attendees were able to collaborate on large challenges facing our world, and they could see and experience those challenges as if they were standing there, while talking about what they were sharing together. They had presence in the shared immersive experience built with Mesh. This is important for the future: you can create a forum of people from anywhere in the world who can join together using any device: virtual reality headset, PC or mobile device. Everyone is equal in the experience, can have a voice and can engage in the feedback and ideation process. This is one of the reasons why I see so much potential for the Metaverse.
What was very promising is that Microsoft wanted feedback and experiences from people at this very early adopter phase so they can “improve the product ahead of bringing Mesh capabilities to a broader set of customers and partners.” To me this means that Microsoft is getting ready for a wider Microsoft Mesh private preview.
Reading through the news article, I was able to dig out several interesting details:
- Microsoft Mesh custom spaces (or the Mesh platform) are clearly further along than I feared. Mesh makes it possible to create virtual spaces (as seen in the Mesh pictures in the news article) where people can interact with each other in an immersive virtual world
- Engage with eye contact, facial expressions and gestures. I hope this means that Mesh for Teams on Meta Quest will support Meta Quest Pro eye and face tracking, and hand tracking as well. Using a Meta Quest Pro or HoloLens 2 with hand tracking is a really powerful way to bring presence and expression to the audience in Metaverse virtual spaces. I have found that I feel much more natural when I can use my hands to point at specific areas or give a thumbs up, a rock sign or applause, for example, without having to worry about the controller or learn something new. Of course hand tracking has limitations, as it only works where the headset cameras can see your hands, but that is a trade-off you learn to work with. In short: facial expressions, eye tracking and hand tracking combined can really boost your presence and natural expression in the virtual space. I am sure Microsoft Mesh will support these, at least eventually.
- Spatial audio is included. Together with the presence features, spatial audio will create better immersion. To be frank, I would have been stunned if spatial audio hadn’t been there; it is needed for any environment that wants to be considered serious and immersive
- Microsoft Teams integration with Mesh will be strong, just as announced earlier. I suspect we will first see Mesh for Teams, and later we will be able to build our own custom virtual spaces with the Mesh platform.
- You can collaborate with Microsoft 365 applications. Whiteboard and PowerPoint were already announced earlier, but this news article also mentioned the ability to use content stored in OneDrive. This could mean, for example, 3D models and objects you bring into the space, as well as videos, PDF documents and images. I also hope it will include Power BI reports and Power Apps, to connect the virtual space with business processes and line-of-business applications. Microsoft Loop support will be there via Whiteboard, and perhaps via other documents as well, which would enable going fully hybrid, with information flowing between the physical and virtual worlds and between participants working on the subject.
- Mesh enables participant movement, interactions and collaboration, and keeps the visual environment in sync so everyone shares the experience simultaneously and can enjoy real-time gestures and changes in the environment. This is of course something any good-enough virtual environment does today, and it is one of the requirements (along with spatial audio and interactions) for a virtual space to be called immersive and collaborative.
Mesh makes it possible to bring in and share visual 3D content and information. This could be models, objects and more complex 3D content; processes and animation are supported as well. That way the environment and content can serve as inspiration for, or a reference in, the collaboration. It doesn’t make sense to bring all content into a 3D space in 2D format (like 2D reports and dashboards); it is much better to bring in information presented in a visual, 3D way. In the article, the GCV had marine and mangrove forest environments where participants could experience and feel the importance of those ecosystems by seeing them from the inside, in an immersive way. The experience is surely something else than just reading about it on paper or watching a video. I especially loved what the article said about Mesh capabilities.
I have really good vibes that Microsoft Mesh is going to change the way we think about how the Metaverse and immersive spaces can be utilized and what we can do with them. An immersive experience, being there at the location virtually, gives us much more empathy, as well as understanding, when we can interact with the environment visually. We can participate in those spaces easily from Microsoft Teams using any device: virtual reality headsets, PCs or mobile devices. Integrated with Microsoft 365, it also brings enterprise-class security and safety for content and people.
Read Microsoft’s article Microsoft Mesh: Creating connections at the World Economic Forum 2023.
Microsoft Mesh pictures in this post are from that article.