Metaverse increases equality, accessibility and inclusivity

Welcome to the year 2022! I hope it is going to be a happy and wonderful one for you! The past couple of years have had plenty of room for improvement, including in the number of safe in-person meetings and events. As in-person meetings won't always be possible, reasonable, or cost-effective, the immersive Metaverse will start to see the light of day more and more often this year. We have to rethink how we work in the future and where in-person meetings are relevant and needed. Hybrid meetings are going to be the norm even after the pandemic; not everyone is going to be present physically. This means we need new tools, new practices and new ways to work together no matter where people are. With this in mind: 2022 will be the year the Metaverse emerges.

I had a very interesting conversation about the Metaverse with one person during December 2021. We talked about several aspects of the Metaverse, the Microsoft Metaverse and how all this will change the way we work and meet in the future, but one thing especially caught my attention: meetings in the immersive 3D Metaverse will increase accessibility, equality and inclusivity more than our traditional 2D meetings with cameras on.

Everyone will be equal in the immersive Metaverse, as each person is represented by an avatar – their Digital Twin.

The avatar is our digital twin in some cases, but that doesn't mean it needs to be an exact replica of us. My avatar in AltSpace VR had purple skin because I was "Mr. Teams" there. Some people found that odd, but with my role-playing game background it felt very natural; after all, in Skyrim and other games you might play a different gender or pick a race that is more than just a variation of human. Skin color is just a minor detail there and doesn't affect how the game continues. Very often we build our avatar to look like us, but it doesn't always have to be so.

Accessibility and Inclusivity

With the above in mind – our avatar doesn't have to replicate our physical qualities or appearance. People in wheelchairs can walk, facial features are not reflected, and skin color and hair style can be adjusted as we like. Physical disabilities are not reflected in the Metaverse. Others don't know if you are using accessibility aids or different devices to move your virtual avatar in the meeting.

Photo by ELEVATE on Pexels.com

It is not just about your physical qualities – how you dress or whether you wear makeup doesn't matter either. You might have had a tough workout just minutes before the meeting started. Yes, you could join a 2D meeting with your camera off, but there may be more peer pressure to turn it on.

There are already tools that can create an avatar, with some 3D elements, from your selfie. I really hope Microsoft adds an option (controlled by admins) to automatically create 3D avatars for people based on their Microsoft Teams picture. This would add to inclusivity: you wouldn't have to feel left out or inferior just because you haven't worked on your avatar. Another option should be to upload a picture to create your company avatar, alongside your profile photo. Editing your avatar is also important: you may want to express yourself in a certain way. Not all avatars need to wear a suit – or perhaps you want yours to wear one while you sit in your home office chair in a t-shirt.

Accessibility is a big topic and a must-have for any new technology. We are all going to need some accessibility aids at some point in our lives. It may be just glasses, but in a global world it starts with the ability to join the meeting from anywhere, understand the content and conversation, and provide your insights to others. The first thing that comes to mind is "the universal translator": simultaneous captioning and translation of speech. The second is the option to translate visible texts (or documents) in the Metaverse into your own language, and to use your native tongue to provide your input. As I am not a native English speaker, I have already found that Live Captions help me follow some events, especially when there is background noise. Coming from Finland, I have gotten used to subtitled movies and series, so reading captions is very natural to me.
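As a rough sketch of what I mean by a universal translator, the snippet below uses the Azure Cognitive Services Speech SDK for Python to recognize speech from the microphone and translate it in one pass. The key, region and language choices are placeholders I picked for illustration, not a statement of how Microsoft will build this into the Metaverse.

```python
# A minimal captioning + translation sketch (Azure Speech SDK for Python).
# The subscription key, region and languages are placeholders for illustration.
import azure.cognitiveservices.speech as speechsdk

translation_config = speechsdk.translation.SpeechTranslationConfig(
    subscription="<your-speech-key>", region="<your-region>")
translation_config.speech_recognition_language = "en-US"  # language spoken in the meeting
translation_config.add_target_language("fi")              # caption language for this listener

recognizer = speechsdk.translation.TranslationRecognizer(
    translation_config=translation_config)

# Recognize a single utterance from the default microphone and show both
# the original caption and its translation.
result = recognizer.recognize_once()
print("Caption:    ", result.text)
print("Translation:", result.translations["fi"])
```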

Implementing a simultaneous sign language avatar next to you would also be possible for situations where it is needed – with subtitles we already know what needs to be translated. Usually subtitles plus live translation (with individual options for contrast and text size) do the job, but there are cases where following a sign language avatar might be better: perhaps when the Metaverse meeting is projected onto a large screen and lots of people are physically present in the room.

What if you have a more severe visual impairment or are blind? I hope that Metaverse developers take this into account and provide good support with audio guides that help you move through the Metaverse world. Unlike in a 2D meeting, you need to be able to navigate yourself to the location you want. An audio guide could describe your surroundings and teleport you to interesting hotspots. Live Captions with translation could perhaps mute other audio and speak the translation to you. 2D meetings should add this capability as well – it is not something limited to the immersive Metaverse.
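To make the audio guide idea a bit more concrete, here is a hypothetical sketch: given the avatar's position and a list of hotspots, it picks the nearest one and speaks a short description through whatever text-to-speech callback is available. The Hotspot class and describe_nearest function are my own illustration, not an existing Metaverse API.

```python
from dataclasses import dataclass

@dataclass
class Hotspot:
    name: str
    description: str
    position: tuple  # (x, y, z) in world coordinates

def describe_nearest(avatar_pos, hotspots, speak):
    """Pick the closest hotspot and describe it via a text-to-speech callback."""
    def distance(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    nearest = min(hotspots, key=lambda h: distance(avatar_pos, h.position))
    speak(f"Nearest to you: {nearest.name}. {nearest.description}")
    return nearest

# Example: the guide could also offer to teleport you to the hotspot it describes.
hotspots = [
    Hotspot("Stage", "The keynote stage, where the presenter is speaking.", (0, 0, 10)),
    Hotspot("Networking lounge", "Tables where attendees are chatting.", (15, 0, 2)),
]
describe_nearest((1, 0, 8), hotspots, speak=print)
```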

And speaking of speech: translation can be extended to reading any content aloud to you (and only to you), but also to text-to-speech towards others. This would let people who have difficulty speaking participate better in the Metaverse, and others would not even have to know about that disability. Personally, I have been in many late-hour meetings where I can't speak out loud at home (others are sleeping), and typing into the Teams chat doesn't always get the attention it should. That is why this is another ask for the Metaverse developers: please add the ability to create a voice for my avatar and let me type what it says to others, perhaps even with translation. This would be a good feature for Teams meetings as well.
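As a small proof of concept of the "type and let my avatar speak" idea, here is a sketch using the open-source pyttsx3 text-to-speech library for Python. The voice choice, speaking rate and message are just examples; in a real meeting the synthesized audio would of course be routed to the other participants.

```python
# Type a message and have a synthetic voice speak it out loud (pyttsx3 sketch).
import pyttsx3

engine = pyttsx3.init()
voices = engine.getProperty("voices")        # available voices differ per platform
engine.setProperty("voice", voices[0].id)    # pick a voice for the avatar
engine.setProperty("rate", 170)              # speaking speed (words per minute)

typed_message = "I can't speak out loud right now, but my avatar can."
engine.say(typed_message)
engine.runAndWait()
```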

Photo by Tirachard Kumtanom on Pexels.com

Other accessibility aids could be implemented in the Metaverse as well: magnifying texts, zooming, changing contrast and so on. How about audio accessibility? Spatial 3D audio settings need to be adjustable, so I can concentrate on people near me instead of hearing the background noise and chatter. Using spatial audio and distance, this would be very easy to take into account if developers choose to – see the sketch below. You don't even need to have hearing issues to benefit from this one: a crowded AltSpace VR meeting in a single room can be too noisy and cacophonous today. I am counting on this not being an issue in the future Microsoft Metaverse – and perhaps I can tune the levels myself as well. When talking about the hybrid world – combining physical and virtual worlds – there are currently gaps in 3D audio, but I think that will be a topic for another future blog post where I dive into it.
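A simple way to picture this is inverse-distance attenuation, the kind of falloff game audio engines typically use: voices close to you stay at full volume and voices further away fade out. The function below is my own sketch with made-up parameter values, not an actual Metaverse or AltSpace VR setting – but it shows how a user-adjustable rolloff knob could let me tune the levels myself.

```python
def attenuate(volume: float, distance: float,
              min_distance: float = 1.0, rolloff: float = 1.0) -> float:
    """Inverse-distance attenuation: full volume inside min_distance, fading beyond it.

    A larger rolloff value mutes distant talkers faster - this could be the
    user-adjustable setting mentioned above.
    """
    if distance <= min_distance:
        return volume
    return volume * min_distance / (min_distance + rolloff * (distance - min_distance))

# A colleague 1.5 m away stays clear, a crowd 10 m away fades into the background:
print(round(attenuate(1.0, 1.5), 2))   # ~0.67
print(round(attenuate(1.0, 10.0), 2))  # ~0.1
```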

No, not all people can afford or are able to use VR headsets to join the meeting. Joining via Teams on your computer or mobile device removes that requirement. You can be a full member of the team despite not having the latest and most expensive gear in use. Teams has already done this for lots of people – the ability to join meetings equally from other locations – but the Metaverse can take this even further by allowing that virtual immersive presence.

Equality

At this point we can see the Metaverse taking steps forward by adding possibilities to improve accessibility and inclusivity: making people equal in the immersive Metaverse without anyone having to think about their looks, color or other traits. This might help younger people and especially women. If avatars have an audio profile, it could be used to mask a person's gender or ethnicity, which would be very useful for removing presumptions and biased evaluation. Using AltSpace VR as an example: you can create your avatar with any gender or color you like. Remember: I had purple skin in AltSpace.

Does this mean everyone would have to mask their gender and be equalized? Of course not! Would it make it possible for someone to pretend to be someone else in the Metaverse? Yes, it could. That is why there need to be other ways to verify a person's identity, so that you can't fake being someone else. But also: this would be a step towards equality in certain situations. We could focus on the goal instead of the person or their physical traits. Perhaps I would use my real identity (and an avatar representing the real me) in company events and meetings, but when joining a public event I might want to mask myself. This gives me mixed feelings about what is ethical and how these options should be applied. We have known, since the popularization of the internet, that people can be something other than what they present themselves as: scammers, identity thieves and others. I don't think we want that in the Metaverse, but it may not be avoidable in public metaverses. Inside and between organizations, people need to be verified by technology.

Identity and representation will be a topic of discussion when the Metaverse becomes more commonplace in businesses.

Photo by Pixabay on Pexels.com

Speaking of the physical world, the big equality step is that we can meet people in the Metaverse no matter where they are physically. Location doesn't matter, just like it doesn't matter in 2D Teams meetings and events today. Unlike the 2D Hollywood Squares view, the immersive Metaverse gives us more presence, a more immersive experience and the feeling of "being there with others". Events, meetings, hybrid setups and more will add to this Future of Work.

In the Metaverse we can all be equal, get our voices heard and have our presence seen.

I think Microsoft Teams will make the immersive Metaverse very popular – the threshold for attending in 3D is very low when it is embedded in the tool. You could say Teams will democratize the Metaverse and make it widely used by making meetings with shared experiences more inclusive and accessible.
