No one will want to spend long periods of time in virtual worlds if harassment there is as bad as it is in today’s online environments. According to Spectrum Labs’ chief strategy and marketing officer, Tiffany Xingyu Wang, the harassment and personal attacks that 41% of US internet users have already experienced online, including name-calling, embarrassment, and physical threats, will only get worse in virtual worlds.
As she put it, “the metaverse is an immersive and multisensory experience, which magnifies the impact.” Toxicity takes less time to take hold there.
How to build welcoming virtual worlds
Wang believes it is too late for legacy social media platforms to address the issue of online harassment. But as the number of virtual worlds increases, she believes the technology industry has a chance to get it right this time. Companies that build safe and responsible communities will gain a competitive advantage in the metaverse.
Using safety by design as a differentiator can help companies attract new employees, she said, and “new platforms will win” through superior performance in these areas.
Basic measures must be in place to ensure the personal safety of metaverse residents, in addition to safeguarding them against theft and financial scams.
To avoid a repeat of Web 2.0’s problems, a shared code of conduct, complete with penalties, must be established. Spectrum, a trusted advisor in the fields of social media, dating, gaming, commerce, and education technology, works with clients to build AI infrastructure that fosters user trust and safety. Customers use the company’s content moderation tools to detect hate speech, racism, child sex abuse, and other harms.
A community policy is more than an insurance policy or an inevitable cost centre, according to Wang.
People may choose to come to your community for this reason, which lowers customer acquisition costs and boosts client loyalty, she said. “Trust and safety have a business case.”
How to create a policy for your community
According to Wang, any company, individual, or organisation operating an immersive online community must write clear rules of conduct and enforce them, and doing both well is essential.
Many companies have policies in place but don’t enforce them, she said.
The Trust & Safety Professional Association (TSPA) is a non-profit organisation for people who establish and enforce the rules and guidelines that define acceptable conduct and content on the internet.
Interest in the Trust and Safety Foundation’s work in this area has grown considerably since the pandemic, according to Wang.
Writing a code of conduct that is specific to the platform’s users is another important factor in achieving success. For example, policies for an LGBTQ dating site would be very different from those for a gaming site geared toward children aged 9 to 13.
Companies that do this well have a policy team that collaborates with marketing and communications to write a policy that reflects the brand’s identity, she said.
According to Wang, the community’s policies also dictate the appropriate punishments for those who break them.
It’s impossible to create enforceable regulations without first understanding the “fundamental code of conduct.”
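As a minimal sketch of the graduated sanctions Wang describes, the following uses an invented penalty ladder (the tiers and names are assumptions for illustration, not any platform’s real policy): repeat violations escalate from a warning up to a permanent ban.

```python
# Hypothetical graduated-penalty ladder; tiers are illustrative assumptions.
PENALTIES = ["warning", "24h_mute", "7d_suspension", "permanent_ban"]

def penalty_for(prior_violations: int) -> str:
    """Return the sanction for a user's next violation, capped at the top tier."""
    return PENALTIES[min(prior_violations, len(PENALTIES) - 1)]

print(penalty_for(0))   # warning
print(penalty_for(10))  # permanent_ban
```

The point of publishing such a ladder alongside the code of conduct is that users can predict the consequence of a violation before it is enforced.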
Artificial intelligence enforces and automates Spectrum’s moderation rules. As a result, moderators are less likely to be exposed to violent or otherwise offensive material.
Wang also stressed the importance of openness in creating a secure online community.
“If you take action, if you suspend a user or you take a person off the platform, you have to tie it back to policy,” she said.
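One way to make that linkage concrete is to record every enforcement action together with the specific policy clause it enforces, so a suspension can always be traced back to a rule. The structure below is a sketch under assumed names (the clause identifier and fields are hypothetical):

```python
# Hypothetical audit record tying a moderation action to a policy clause.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EnforcementAction:
    user_id: str
    action: str          # e.g. "suspend", "remove_content", "ban"
    policy_clause: str   # identifier of the rule that was broken
    evidence: str        # what triggered the action
    timestamp: str

def suspend_user(user_id: str, clause: str, evidence: str) -> EnforcementAction:
    """Suspend a user and return the audit record explaining why."""
    return EnforcementAction(
        user_id=user_id,
        action="suspend",
        policy_clause=clause,
        evidence=evidence,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

record = suspend_user("u123", "3.2-harassment", "repeated name-calling in chat")
print(record.policy_clause)  # 3.2-harassment
```

Because the record names the clause, the platform can show affected users and auditors exactly which rule justified the action.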
Wang recommended a new white paper from Grindr as a good example of how to develop content moderation strategies. Three trust and safety leaders at Grindr wrote the paper “Best practises for gender inclusive content moderation,” which examines various content moderation decisions and the multiple factors to consider when setting policies.
Trust and safety experts Vanity Brown and Lily Galib, and Alice Hunsberger, the senior director of customer experience at Grindr, explain how to design comprehensive policies, review best practises for inclusive moderation and offer resources for moderators and users.
Who sets the rules in virtual worlds?
That need for a shared code of conduct is related to the security problem: Who’s in charge here? As Ahmer Inam, chief AI officer at Pactera Edge, describes it, part of the challenge is that there is no clear enforcing entity for metaverse rules.
“Virtual worlds are completely borderless, so whose laws apply?” he said.
No one wants to extend the authoritarian tendencies of many governments into metaverse worlds. However, if theft, scams, and general criminality are rampant in these virtual places, that too will discourage wider adoption.
Inam thinks public and private entities should partner to develop a shared set of rules governing metaverse worlds.
He sees potential in learning from concerns about nuclear power that led to international treaties to govern the technology with trustworthiness as a central organising principle.
Inam predicts that once regulated industries start to build metaverse experiences, regulations will come faster. He also thinks that ethical AI charters could be expanded to cover the metaverse, but that will require a partnership between industry and government.
“The growth of the metaverse could accelerate not just the tech but an individual bill of rights and privacy because of the potential for harm that exists,” he said. “As a technologist, I’m excited for what’s to come, but as a citizen of our society, I’m a little bit concerned.”
The pandemic showed that internet connectivity is as important as electricity, and prompted government investment to expand access to communities with no high-speed access. James Arlen, CISO at database-as-a-service company Aiven, thinks something similar will happen with digital identities.