The Oasis Consortium, a nonprofit devoted to making the internet safer, launched its principles for online user safety today.
Tiffany Xingyu Wang, president of Oasis, created the organization to gather thought leaders across social media, gaming, and dating to accelerate the development of an “ethical internet.” The group has unveiled its operating principles, dubbed User Safety Standards for our Digital Future. (Wang will be a speaker at our GamesBeat Summit: Into the Metaverse 2 online event on January 26-27).
On the one-year anniversary of the Capitol insurrection, the group focused on existential threats and wants to put safety guardrails at the core of online communication. This will become more important as industries shift toward Web 3 (the decentralized internet) and the metaverse, the group said.
The Standards are the first output of the think tank, launched in August to establish and popularize a new digital sustainability model for business in a Web 3 world. The consortium members include Riot Games, Pandora, The Meet Group, and others.
They worked in concert to create the best practices, and the work involved hundreds of conversations over months with professionals across gaming, dating, and social apps.
The group suggests that companies conduct an internal user safety assessment to measure performance and identify ways to improve. It also said companies will be able to earn a certification for Oasis Digital Sustainability in User Safety.
Companies should recognize that user safety is a company-wide initiative: it should have an executive-level champion who is accountable for both vision and execution, it should be reflected in product design and company workflows, and it should have a budget.
“User safety is a challenge that continues to evolve, along with user behaviors, world events, and technical capabilities,” the group said.
Oasis also said companies should develop a living roadmap to continue iteration and improvement, and they should plan for user safety proactively, not reactively.
Policies should be based on representation, learning, and wellness. The team that develops policies and enforcement must be diverse in every dimension, especially social background, the group said.
It noted that moderators and other employees are exposed to the worst of humanity under strict productivity goals. Companies should provide resources and design programs to protect and improve the wellness of their teams.
Companies should also be aware of local and international regulations that require them to understand, record, and report what they’re doing on user safety. To help, companies will often need to seek outside opinions or partners on trust and safety, which aids accountability.
Wang said in an interview with GamesBeat that preparation for the consortium began around August 2020. The consortium then formally got off the ground in August 2021 and began work on the standards.
The intention was always to go for something larger than just gaming, which is served by organizations like the Fair Play Alliance, Wang said. She noted that if tech platforms do not address fundamental issues for user safety, then it becomes the responsibility of others, like brands or those who actually implement the guardrails, to help. The user safety topic is also bigger than the anti-toxicity issues that are a major concern in games.
“Our intention is to make this available to as many companies as possible,” Wang said.
Wang said the organization’s pillars are safety by design, privacy by design, and inclusion by design. That’s because things like harassment and hate speech are rooted in a lack of consideration for inclusion.
And there are challenges to balancing these different pillars. One example: Activision recently launched its Ricochet anti-cheat technology for Call of Duty: Vanguard. It helps reduce rampant cheating by identifying the computers used to cheat and banning them, rather than just banning individual accounts that can be quickly replaced with new ones.
Activision does this by using high-level access to the computer’s operating system to verify whether that machine has been used to cheat before. But this requires a level of trust and disclosure, as that high-level access can also be a privacy concern. Wang agreed that if you want to battle toxicity, you have to be careful to do it in a privacy-preserving way.
“We are actively standing up a privacy advisory board,” she said.
Oasis is also starting to work with partners and other like-minded organizations. And it will determine how to proceed with its certification and auditing process.