Nearly everyone with a stake in the metaverse, from tech companies to retailers to content producers to investors, agrees that it is ripe for the taking.

But some psychologists and mental health experts say the race to profit is obscuring a critical question: Will the metaverse be a safe place, especially for children and teens?

So far, the answer is not comforting. Research shows that social media can harm children’s and adolescents’ mental health, from bullying and harassment to problems with self-esteem and body image. In the wide-open metaverse, with its vast virtual worlds built for both work and play, those same dangers could be just as prevalent, if not worse.

Some experts say tech companies could actually benefit children’s mental health if they take these concerns seriously and build safeguards into their metaverse products.

All of these new tools and possibilities could be used for good or evil, says clinical psychologist Mitch Prinstein, who is also the American Psychological Association’s chief science officer.

Worse than social media, perhaps?

Some children and teenagers are already being harmed by today’s social media platforms. Albert “Skip” Rizzo, a psychologist who serves as the director for medical virtual reality at USC’s Institute for Creative Technologies, says that virtual reality’s level of immersion could make those problems even worse.

There is a potency to being immersed in a world, rather than just watching and interacting with it on a flat-screen monitor, Rizzo says. Even though nothing can physically touch us in the space where we are embodied, we can still be exposed to things realistic enough to be psychologically damaging.

In the metaverse, where users interact through 3D digital avatars, the ability to create a virtual self that differs from your real-life image can be “pretty dangerous for adolescents, in particular,” Prinstein says.

As a teenager, “you are what other people think of you,” he says. The prospect of being able to fabricate one’s own identity and receive different feedback on it can shatter a teenager’s sense of self.

Prinstein fears that tech companies are exploiting young people’s vulnerability by marketing to them through social media and metaverse platforms at a critical juncture in their mental and emotional development.

As he puts it, “This is just an exacerbation of the problems that we’ve already begun to see with the effects of social media. It’s causing more isolation. Concerns about body image and exposure to suicidal content are on the rise because of this.”

Some problems already exist.

Meta released Horizon Worlds, a virtual reality social network, in December. Microsoft launched a cloud-based 3D business meeting service in March. Other companies, including popular online game makers like Roblox and Epic Games, are establishing their own footholds in the metaverse.

There is already evidence that one such platform, VRChat, is dangerous for young users. In December, the nonprofit Center for Countering Digital Hate (CCDH) researched VRChat, which is typically accessed through Meta’s Oculus headsets, and found that minors were regularly exposed to graphic sexual content, racist and violent language, bullying and other forms of harassment.

Meta and Oculus both prohibit a wide range of abusive behaviour on their VR platforms. When contacted for comment, an Oculus spokesperson pointed CNBC Make It to the company’s previous statements about building the metaverse “responsibly” and to the Oculus platform’s tools for reporting abuse and blocking other users. VRChat did not immediately respond to CNBC Make It’s request for comment.

Even with the best of intentions, enforcing and monitoring safety policies in virtual environments can be a challenge, according to Imran Ahmed, the CCDH’s CEO.

When it comes to safety in virtual reality, “you can’t search [the metaverse] for hate or sexual abuse,” he explains. “It’s impossible. There’s nothing you can do about it because it happens so quickly.”

Ahmed predicts that the responsibility for monitoring children’s access to the metaverse will fall on parents. “I think parents will ask themselves: Do I feel safe knowing that Mark Zuckerberg is the guy in charge of deciding who influences my children, who may be able to bully them, and whether or not they are safe in cyberspace?” he says.

‘Motivated to make money’

Still, virtual reality and the metaverse have enormous potential to improve users’ mental health. Researchers like Rizzo at USC have found that virtual reality treatments can help patients develop empathy and cope with conditions like post-traumatic stress disorder (PTSD).

But Rizzo and Prinstein agree that it is the responsibility of technology companies to put the safety of their users ahead of their own financial interests.

Ahmed says tech companies could take steps to keep young users safe in the metaverse, including strict age verification to prevent predators from posing as younger users, large numbers of content moderators and “rapid response” when users report inappropriate behaviour.

“There’s no reason why there couldn’t be the presence of moderators in spaces where children are present [or] virtual chaperones,” he explains. “… But money is required for that, of course.”

Policing children in the metaverse may also be too big an ask for many parents, who have “relatively little personal experience with understanding these platforms,” Prinstein says.

Companies should be “incentivized” to use their “brilliant tools to actually improve society,” he says. “At the moment, they’re motivated to make money.”

By Adam
