Hank Howie, Eve Crevoshay, David Hoppe and Richard Warren came together to share some thoughts about online safety at the GamesBeat Summit.
Our world is increasingly an online one. Software like Discord makes creating online communities quick and easy. Online experiences like Fortnite aren’t just games anymore; they serve as hangout spots. Friend groups aren’t just local anymore; they’re potentially global.
It’s an incredible time to be alive, but it comes with some significant downsides. The social pressures are different. The urge to fit in with a friend group or risk exclusion isn’t as overwhelming; it’s easy to find a space that aligns with your personal views. Solving the online safety issues that crop up as a result is a big deal.
That’s great for folks seeking inclusion, especially marginalized groups. It’s not so great when it facilitates toxicity in online spaces. How do you foster the kinds of community values you want and make sure the kinds you don’t want don’t develop? How do you keep people safe?
“There are norms and behaviors and ideologies that become really common in these spaces,” explained Take This’ Eve Crevoshay. “They are a small but very loud problem. That loudness means that it has become normalized.”
There are problems, and solutions
Think about it like this. If you’ve got ten people in a room and one of them says something rude? If none of the other nine object to it, it makes it seem acceptable. Worse, if one of the other nine joins in, suddenly you’ve got what looks like normal behavior.
If, then, people start looking at the room and see two very vocal people acting this way? Now it seems like this is the room for people who act like that. Now it doesn’t matter that there are still eight people in there not engaging. Now you’ve got a place where rude people hang out.
In online communities, especially in real-time chat applications like Discord, that’s a huge problem that needs to be solved.
“On Discord it’s really about how users connect to one another,” said Windwalk’s Richard Warren. “Designing really good moderation programs around what’s happening where your diehard fans hang out, and setting … a culture around self-moderation inside communities. Promoting people doing good deeds inside the community.”
To go back to that earlier hypothetical, it means convincing some of those other nine people to speak up and make it clear that being rude isn’t welcome, so as new people arrive they already know what is and isn’t acceptable.
If that doesn’t work, there’s always the law
For the kinds of disruptive elements that refuse to be moderated? Well, it isn’t quite the wild west anymore. Laws around the world are catching up to what happens online, and legislators and governments are finally getting a handle on conduct that happens primarily in digital spaces.
Canada, for example, has laws for online crimes, including things like cyberbullying. The EU has the Digital Services Act. California is working on its Age-Appropriate Design Code Act. It’s still going to take some time for the laws to catch up to where they need to be.
But they’re getting there. Online safety is still tricky, but the guardrails are taking shape.
“The days of anything goes, of sort of turning a blind eye? That’s not going to fly,” said Gamma Law’s David Hoppe.