metaverse(s): beyond my mixed feelings

Zuckerberg said the metaverse will be an embodied internet. Is this what we want?

‘The Internet’, besides the numerous good things that it brings, is also an echo chamber of racism, polarization, radicalization, and sexism. It is naive to think that the metaverse(s) ecosystem would not carry the biases of today’s Internet, since they both share the same enablers: the same underlying technologies and the same players.

What is the Metaverse, and why should we care?

Combining ‘meta’ (‘above and beyond’) with ‘verse’ (short for ‘universe’, everything that exists in the cosmos, including space and time), the ‘Metaverse(s)’ represents the aggregation of virtual and semi-virtual environments, accessible from virtual and real spaces, where people (or their avatars) will work, shop, have fun, work out, socialize, relax, learn, and more.

Many factors come into play to enable this metaverse, including hardware, such as virtual reality (VR) headsets; infrastructure, such as 5G; and technologies, such as machine learning algorithms, computer vision, data collection, and data analysis.

The way these technologies materialize today is limited to a 2D, non-immersive experience, but they already have a non-negligible impact on our real lives, from the individual to the societal level, and they are increasingly powerful, automated, ubiquitous, and intrusive. How will they impact our lives through a medium that goes beyond a screen?

This essay is a reflection on what it takes to design the metaverse ethically and sheds light on who is catalyzing this paradigm change.

Bias in design

It is only recently that we have started to acknowledge that one-size-fits-all design, whether in everyday objects, clothing, medicine, or software, is dysfunctional, if not dangerous.

For example, did you know that even the most common items we use day-to-day, such as a seatbelt or a defibrillator, are dangerous for most female users (Invisible Women, Caroline Criado Perez)? This is not rooted in a desire or strategy to exclude, but rather in the inheritance of the pursuit of perfect proportions kicked off by Da Vinci’s Vitruvian Man, then later emphasized by the philosophy of The Bauhaus.

The obsession with harmony, combined with the capacity to mass-produce, led to the standardization of architecture, tools, fonts, object design, and more. The human proportions used for reference were those of a healthy young Caucasian man (fun fact: Le Corbusier’s Modulor model was 1.75m tall, also the average height of Frenchmen at the time).

“The different methods used to represent the body reveal that the ‘human figure’ is gender and race specific: male and white. […] The illustrations dimension the body in several positions, but only one type of body is shown. Historically, when a single body is proposed to represent all people, the body is male. Such standards remain firmly embedded in modern architecture, in the dimensions, connections, and ideas of minimal and efficient space and in the regulations that control them.” — Lance Hosey, “Hidden Lines: Gender, Race, and the Body in Graphic Standards”, Journal of Architectural Education, November 2001.

It’s only in the 1960s that the standardized, design-for-the-average mindset started to shift; however, the bias is still embedded in most design processes, hardware, and technologies — including those that power the metaverse.

For example, virtual reality headsets are not suitable for many Black hair types and hairstyles, voice recognition systems are gender biased (they have higher error rates with higher-pitched voices than with lower-pitched ones), and facial recognition technologies are biased against darker skin tones.

Screenshot of a recent Reddit post “how to wear a rift s headset with big afro?” (Source: Reddit)

Inappropriate design, including biased software, can be harmful if not dangerous — remember how Tay became antisemitic, racist, and misogynistic in less than 24 hours of learning from interactions on Twitter, and how large technology companies shut down facial recognition systems because of their bias against darker skin tones, or saw them used to carry out surveillance and ethnic cleansing at scale.

Correcting bias is hard, if not impossible. But when inclusion is a core principle of the design process, bias can be avoided, and the benefits end up making life easier for everyone — this is also known as the ‘curb-cut effect’.

The user and the product

Designing for everyone requires the intent to do so, a robust inclusive methodology, and a diverse team, but most importantly, it requires designing for the user. The way social media apps have been built is far from putting people at the center of their design — the data points people generate while using the apps are used only as a source of monetization.

The values promoted by the tech revolution are very similar to those promoted by the industrial revolution: extractive, individualistic, and short-sighted. Massive digital extraction started with Larry Page’s vision that the technologies enabling the capture, storage, search, and display of data would become increasingly cheap, making it easier to document users’ behaviors and turn them into profitable products. Here is an extract from Shoshana Zuboff’s latest op-ed:

When asked “What is Google?” the co-founder Larry Page laid it out in 2001, according to a detailed account by Douglas Edwards, Google’s first brand manager, in his book “I’m Feeling Lucky”: “Storage is cheap. Cameras are cheap. People will generate enormous amounts of data,” Mr. Page said. “Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”

Instead of selling search to users, Google survived by turning its search engine into a sophisticated surveillance medium for seizing human data. Company executives worked to keep these economic operations secret, hidden from users, lawmakers, and competitors. Mr. Page opposed anything that might “stir the privacy pot and endanger our ability to gather data,” Mr. Edwards wrote. Source: The New York Times

It’s only in the past decade that public opinion, along with regulators, has started to grasp the scale of social networks’ unintended consequences: swayed elections, polarization, harm to teens’ body image, user manipulation through positive reinforcement, and more.

And this is only with behavioral and personal data gathered from the internet, smartphones, and apps. How will biometric and volumetric data and immersive experiences affect people’s behaviors in the metaverse, and in the real world?

Creating harmless spaces in the meta world

Speaking of behavioral change, some metaversical features might actually help move the needle toward mutual respect by enforcing it with real-virtual consequences — which usually don’t happen enough IRL — for those who fail to comply with the rules.

Trolling & Harassment

Harassment is a problem taken seriously in the VR world, especially on multiplayer platforms. In 2021, 83% of multiplayer VR gamers had experienced some form of harassment, 49% of women had experienced harassment in virtual reality, and 30% of men reported having experienced homophobic or racist comments.

Harassment (and the fear of harassment) in VR comes in many forms. Verbal, physical, and spatial abuse can violate different interaction spaces, from the intimate and personal to the social and public. Because the sense of immersion and interaction in VR is much greater than on the internet, moderation presents another challenge: by design, it’s difficult to transpose the same moderation and banning principles typically used for 2D platforms and applications.

Anti-Harassment Features

Platforms increasingly include features and design tweaks to prevent harassment and trolling. AltspaceVR, now part of Microsoft, features a personal bubble active by default to prevent other avatars from getting too close or handsy (this Reddit thread of 250+ comments led AltspaceVR to take action in 2016).

AltVR Personal Bubble
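In engine terms, a personal bubble boils down to a per-frame distance check between avatars. Here is a minimal Python sketch of the idea — the radius, function name, and hide-the-avatar behavior are illustrative assumptions, not AltspaceVR’s actual implementation:

```python
import math

# Illustrative default radius in meters -- not AltspaceVR's real value.
BUBBLE_RADIUS = 1.2

def should_hide_avatar(my_pos, other_pos, bubble_enabled=True):
    """Return True when another avatar intrudes into the personal bubble.

    Positions are (x, y, z) tuples in world space. A real engine would
    likely fade the intruding avatar out gradually rather than hiding
    it outright, and would run this check every frame.
    """
    if not bubble_enabled:
        return False
    distance = math.dist(my_pos, other_pos)
    return distance < BUBBLE_RADIUS
```

The key design choice the feature embodies is being on by default: users who don’t want it can opt out, rather than victims of harassment having to opt in after the fact.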

EchoVR lets users customize social preferences during onboarding. Meta’s Horizon lets users ‘escape’ an uncomfortable or threatening situation by pressing an easily accessible blue shield button. The user is taken to a safe zone and can report, mute or block another user.

The Power of the Community

Moderation is another way to tackle harassment. EchoVR relies on the community to create rules and self-moderate behavior. Support is also available by email to report users who violate the rules and code of conduct. Unity, through the acquisition of Oto, leverages sentiment analysis — a subset of natural language understanding powered by machine learning — to identify verbal hate speech and cross-gender discrimination.
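To make the machine-learning side concrete, here is a deliberately tiny Naive Bayes text classifier in Python — a toy illustration of the kind of classification behind sentiment and hate-speech detection. The training messages are hypothetical, and nothing here resembles Oto’s production models:

```python
import math
from collections import Counter

# Hypothetical labeled chat messages -- a real system trains on
# millions of examples of (often audio-transcribed) player speech.
TRAIN = [
    ("you played really well today", "ok"),
    ("great match thanks everyone", "ok"),
    ("nice shot good game", "ok"),
    ("you are trash get out", "toxic"),
    ("shut up idiot nobody wants you here", "toxic"),
    ("go away you worthless idiot", "toxic"),
]

def train(examples):
    """Count word frequencies per label and how often each label occurs."""
    word_counts = {"ok": Counter(), "toxic": Counter()}
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label with the highest log-probability for the message."""
    vocab = {w for counts in word_counts.values() for w in counts}
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior + log likelihood with add-one (Laplace) smoothing
        score = math.log(label_counts[label] / sum(label_counts.values()))
        total = sum(word_counts[label].values())
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

With `word_counts, label_counts = train(TRAIN)`, a message like “shut up idiot” scores toxic while “good game everyone” scores ok. Production systems add context, audio tone, and far subtler signals, but the underlying principle — scoring text against patterns learned from labeled examples — is the same.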

For now, features and moderation are primarily reactive. The gaming community encourages new users to join private spaces and report abusive players to discourage offensive behaviors. Some platforms also feature ways to customize avatar characteristics, like EchoVR which lets players deepen the pitch of their voice.

Games, online forums, and now virtual communities are a way for underrepresented minorities to connect while protecting themselves. VR has helped some people alleviate gender dysphoria and meet a large trans community in VRChat, for example.

Creating safe spaces with values of mutual respect is one thing, but enabling everyone to access these spaces in the first place is a must-have. This deserves more than a sentence — Ristband’s report on diversity and inclusion in virtual spaces written by Anne McKinnon is absolutely worth a read.

Catalyzing change

I have come to the conclusion that there are three pillars to catalyze change:

  1. Capital — allocating funds towards initiatives and companies that embed diversity, inclusion, and user-centricity principles and actions
  2. Representation — influencing businesses, the public and regulators
  3. Regulation — regulating the tech industry with a framework that puts people before profit

Here is a list of the initiatives and people who are shifting the way funds are allocated, amplifying underrepresented people’s voices and interests in the tech space, and setting up the safeguards to protect people.

Capital allocation

Representation

People

Communities

  • XR women, the community for women in XR and allies founded by Julie Smithson, Karen Alexander, Sarah Klein and Sophia Moshasha

Storytelling

Technology

  • LaJuné McMillian, working on increasing Black body and movement representation
  • Researchers at Data Science Nigeria are working on generating synthetic data of African fashion, after noticing that existing datasets feature mostly Western items.

Platforms

  • Praxis Labs — a Black-founded virtual reality-based diversity and inclusion learning platform designed to redefine work cultures

Regulation

  • XR Safety Initiative (XRSI) is a 501(c)(3) worldwide not-for-profit organization that promotes privacy, security, and ethics in immersive environments
  • Joy Buolamwini’s Algorithmic Justice League
  • Shoshana Zuboff on surveillance capitalism, and Max Schrems, who is pushing for regulation and user protection in Europe

Closing thoughts

Implementing more diverse teams, designing for accessibility, enforcing cross-platform user safety and data privacy, and finding a balance between user privacy and moderation: these steps can make a huge impact on people’s lives and even amplify the positive impact of immersive technology.

Change is on the way, and what gives me hope is seeing an increasingly diverse number of influential voices in the Metaverse world.
