The power of the metaverse and the benefits of virtual worlds are undeniable, yet the damage they could cause, if left to develop unchecked, is deeply worrying. The urgent question is: can the metaverse avoid the ills of social media, such as disinformation, threats to personal data, toxicity, and emotional manipulation?
The metaverse is a 3D, fully immersive, virtual representation of the world that “represents a broad shift in how we interact with technology.” Unlike most existing games and simulations, a metaverse is persistent. Like a public street or a neighborhood, it’s there even when you’re not, and things keep happening.
Last October, Facebook announced it was changing its name to Meta, signaling a full embrace of its belief in a metaverse future. At the time, many critics feared that the change was a tactical distraction from the many harms caused by the company’s profit-driven decisions, and that Facebook’s new immersive platform would only exacerbate its existing security flaws if left unregulated.
Indeed, with the rollout of the metaverse, we are ushering in a new era of mass customization of influence and manipulation. The metaverse will provide a powerful set of tools to manipulate us effectively and efficiently. What’s even more remarkable is the ability to combine individual personalization and mass manipulation in a way that has never been possible before.
Metaverses are envisioned as immersive digital environments, built on augmented and virtual reality, that persist even when users are disconnected. They are likely to mimic the real world to some extent, virtually recreating experiences such as shopping and professional and social gatherings.
Consider the following plausible scenario, which could soon play out in the metaverse, the online virtual reality environments that Mark Zuckerberg and other tech entrepreneurs are rapidly developing: A political candidate gives a speech to millions of people. While each viewer thinks they’re seeing the same candidate, in virtual reality everyone is actually seeing a slightly different version. For each viewer, the candidate’s face has been subtly altered to resemble their own.
This is done by blending features of each viewer’s face into the candidate’s face. The viewers are unaware of any manipulation of the image. Yet they are strongly influenced by it: Each member of the audience is more favorably disposed to the candidate than they would have been without any digital manipulation.
This is not speculation. It has long been known that mimicry can be exploited as a powerful tool for influence. A series of experiments by Stanford researchers has shown that slightly altering the features of an unfamiliar political figure to resemble each voter made people rate that politician more favorably.
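To make the manipulation concrete: the blending described above can be sketched as a weighted per-pixel average of two aligned face images. This is a minimal illustration, not the researchers' actual method — real systems would first align facial landmarks, and the function name and 40/60 ratio here are assumptions for the sketch.

```python
import numpy as np

def blend_faces(candidate: np.ndarray, viewer: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Return a face that is (1 - alpha) * candidate + alpha * viewer, per pixel.

    Assumes both images are the same size and already aligned; the 0.4
    weighting is purely illustrative.
    """
    blended = (1 - alpha) * candidate.astype(float) + alpha * viewer.astype(float)
    return blended.astype(candidate.dtype)

# Example: two tiny uniform grayscale arrays stand in for aligned face photos.
candidate = np.full((2, 2), 100, dtype=np.uint8)
viewer = np.full((2, 2), 200, dtype=np.uint8)
print(blend_faces(candidate, viewer)[0, 0])  # 0.6 * 100 + 0.4 * 200 = 140
```

At a low blend ratio like this, the result still reads as the candidate's face — which is precisely why viewers in such experiments do not notice the manipulation.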
To date, the fight against disinformation has focused on its dissemination through social media platforms and sites that masquerade as news publications. But as we know, communication channels are evolving and so are the tools of disinformation. Many who study this space predict that disinformation will transition to a new arena: the metaverse.
“The problems we have regulating technology companies now will be reproduced and amplified in the metaverse. You think policing state-sponsored disinformation is hard on Facebook and Twitter?” political strategist and former Illinois Deputy Gov. Bradley Tusk wrote. “…How do we ensure accurate information prevails, especially in a context where the alteration of reality is the point? Life on the metaverse will not look or feel like real life, and that’s by design. So how do we keep people safe?”
Another type of global, “always on” digital realm may provide clues to how narratives might spread in the metaverse: online multiplayer video games. These games allow players to interact in real time with other players around the world, creating a kind of large-scale online forum.
Emotional manipulation is another powerful danger. Virtual reality environments, such as Facebook’s metaverse, will allow psychological and emotional manipulation of their users on a level unimaginable in today’s media.
The very same features that make virtual reality environments so attractive as communication environments — the sense that you’ve teleported into a synthetic world — can also harm their users. When it comes to emotional manipulation, two features of the metaverse are particularly important — presence and embodiment.
“Presence” means that people feel they are communicating with one another directly without any type of computer interface. “Embodiment” means that the user has the feeling that their avatar or virtual body is their actual body.
Even in virtual reality’s current, primitive state, these two sensations are what make VR so powerful. They are also what makes emotional manipulation in VR so dangerous.
In brief, society only started to get serious about traditional social media when things got completely out of hand. We must not wait until these technologies are fully realized to consider appropriate guardrails for them. We can reap the benefits of the metaverse while minimizing its potential for great harm.