Presented by Two Hat Security
Player behavior can make or break your game, and fostering positive behaviors within a gaming community is an essential part of growing your brand and retaining gamers online. To address this head-on, game developers must step back and look critically at the root causes of negative behavior.
This means developers must make intentional design decisions, provide a safe and welcoming environment, remind users and players of community norms, offer customization, and amplify the stickiness of social interactions in games.
And as millions of Americans are stuck at home amid the COVID-19 pandemic, the goal of influencing positive gamer behaviors online is more important than ever. School and workplace closures, along with strict containment measures, mean more and more people are relying on technology and digital solutions for entertainment, information, and connectivity, and video gaming has become the answer for many.
Social gaming and virtual worlds are bridging the gap by providing the experiences and interactivity that people across the world are currently craving. So how can game developers continue to encourage positive behaviors as traffic continues to increase? This influx of activity should not threaten positive gaming experiences for gamers, and it is up to game developers to keep improving their moderation capabilities.
Chat volumes are up significantly
Between January 3 and April 7, 2020, chat in cross-platform games, mobile games, kids’ platforms, teen social networks, and virtual worlds increased dramatically week over week. In fact, some Two Hat clients saw chat increase by 40%, 100%, and even 3,000% when comparing March to February.
During these times, there may also be an increase in bullying chat and even Child Sexual Abuse Material (CSAM) or grooming. You can imagine the friction and strain this causes a moderation team. For a small team responsible for moderation, the workload can double or triple almost overnight.
Moderation techniques are needed to manage increased volumes
To manage these increased content volumes, game developers face a number of challenges. Human moderation teams can handle only so much and can easily miss negative content on their sites or game platforms. The following techniques will help your team handle workloads better, reduce the amount of manual labor needed, and prioritize the most harmful content.
Reduce manual moderation
First and foremost, it is important to reduce your reliance on manual moderation. Developers can do this by surfacing community guidelines as part of the experience whenever the user logs in: provide a simple mandatory button that users must click to agree to the guidelines before chatting in the community. You can also implement warning messages whenever the system detects that a user is trying to post content that breaches your community guidelines (such as harassment or dangerous and hateful speech). And use messaging to reiterate that users who submit false reports may face sanctions themselves, which will reduce the number of false claims your team has to investigate.
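As a rough illustration, here is a minimal sketch of that proactive warning step in Python. The classify_message function, the FLAGGED_PHRASES list, and the response shape are placeholders for whatever chat filter and messaging your platform actually uses, not a real Two Hat API.

```python
# Minimal sketch: hold a guideline-breaking message and warn the user,
# instead of posting it and routing it straight to a human moderator.
# FLAGGED_PHRASES and classify_message() are placeholders for a real filter.

FLAGGED_PHRASES = ("example insult", "example threat")

def classify_message(text: str) -> bool:
    """Placeholder classifier: return True if the message breaches guidelines."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in FLAGGED_PHRASES)

def handle_outgoing_message(user_id: str, text: str) -> dict:
    """Decide whether a chat message is posted, or held with a guideline reminder."""
    if classify_message(text):
        return {
            "posted": False,
            "warning": "This message appears to break our community guidelines. "
                       "Please review them before posting.",
        }
    return {"posted": True}
```

A gate like this catches many violations before they go live, which is exactly the kind of content you want to keep out of your manual review queues.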
Sensitive content should be escalated
During this crisis, users are experiencing a vast range of negative life events. In many cases, users may feel the need to express themselves and their feelings through your platform, but it is critical that you strike a balance between safety and expression. Watch for threats of self-harm or other online harms in the community, and escalate that content to a queue for urgent review.
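One way to make sure this content surfaces quickly is to place a priority queue in front of your review tooling. The sketch below is an assumption about how that might look: it presumes each report arrives with a category assigned upstream by your filter, and the category names and priority values are illustrative only.

```python
# Sketch of an escalation queue: the most sensitive reports are reviewed first.
import heapq

URGENT_CATEGORIES = {"self_harm": 0, "csam": 0, "grooming": 0}  # reviewed first
DEFAULT_PRIORITY = 5                                            # everything else

class EscalationQueue:
    def __init__(self) -> None:
        self._heap = []
        self._counter = 0  # tie-breaker so equal priorities keep insertion order

    def add(self, report_id: str, category: str) -> None:
        priority = URGENT_CATEGORIES.get(category, DEFAULT_PRIORITY)
        heapq.heappush(self._heap, (priority, self._counter, report_id, category))
        self._counter += 1

    def next_report(self):
        """Return the most urgent outstanding report, or None if the queue is empty."""
        return heapq.heappop(self._heap) if self._heap else None
```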
Filter settings
Filter settings should also be reviewed for pre-moderation. Some game companies review a large amount of user-generated content before it goes live. In challenging times such as these, however, your team might not have the capacity to review that much content manually, so prioritize these filters and ask whether some content can instead be reviewed after it has been posted (post-moderation) to spread the workload more evenly.
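A simple way to spread that workload is a routing rule that decides, per piece of content, whether it is held for review before publishing (pre-moderation) or published immediately and reviewed later (post-moderation). The thresholds, risk scores, and the stricter rule for young audiences below are assumptions to be tuned to your own filter and community.

```python
# Sketch: route user-generated content to pre- or post-moderation by risk.

def route_content(risk_score: float, audience_is_minors: bool) -> str:
    """Return 'pre-moderate', 'post-moderate', or 'publish' for a piece of content."""
    if audience_is_minors and risk_score >= 0.3:
        return "pre-moderate"   # stricter threshold for kids' spaces
    if risk_score >= 0.7:
        return "pre-moderate"   # clearly risky content is held until reviewed
    if risk_score >= 0.3:
        return "post-moderate"  # borderline content goes live but is queued for review
    return "publish"            # low-risk content needs no manual review
```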
Implement effective sanctions
Finally, once you’ve reduced manual moderation through proactive filters and built escalation queues for the content that requires timely review, you can implement effective sanctions to establish clear consequences for repeated negative behavior. Be sure to implement sanctions that happen quickly, following a clear progression of escalating consequences.
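As a sketch of what such a progression could look like in code: the specific rungs and durations below (warning, mute, suspensions of increasing length, permanent ban) are assumptions, not a prescription from Two Hat, and should be adapted to your own guidelines.

```python
# Sketch of a progressive sanction ladder; the steps and durations are assumptions.
from datetime import timedelta

SANCTION_LADDER = [
    ("warning", None),
    ("mute", timedelta(hours=1)),
    ("suspension", timedelta(days=1)),
    ("suspension", timedelta(days=7)),
    ("permanent_ban", None),
]

def next_sanction(prior_violations: int):
    """Pick the sanction for a user's latest confirmed violation."""
    step = min(prior_violations, len(SANCTION_LADDER) - 1)
    return SANCTION_LADDER[step]
```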
Without consequences, users can continue to abuse both the system and fellow gamers again and again. Don’t give users unlimited opportunities to break your community guidelines.
While staying connected is important during these uncertain times, it’s critical that moderation standards are in place to ensure positive gaming experiences for your users. As the world begins to recover from this pandemic and the gaming industry continues to surge, people will continue looking for new ways to interact with the changing world around them. One day we’ll return to a new normal, and the moderation practices established during this pandemic will set the standard for years to come. But in the meantime, it is our responsibility to protect our gaming communities online.
For more information, please download Two Hat’s full e-book, Content Moderation in Challenging Times.
Carlos Figueiredo is Director, Community Trust & Safety at Two Hat Security.
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact sales@venturebeat.com.