Call of Duty has long had problems with toxic chat on its voice channels, and Activision announced it is doing something about it in a collaboration with Modulate.
Activision called this a significant step forward in its ongoing battle against toxic and disruptive behavior in voice chat. The company said Modulate's AI-based ToxMod voice chat moderation system will debut with the release of Call of Duty: Modern Warfare III on November 10.
It’s a big win for Modulate and its ToxMod technology, which identifies toxic speech — including hate speech, discriminatory language, and harassment — in real time and flags it for enforcement. This system will complement Call of Duty’s existing moderation efforts, which include text-based filtering across 14 languages and a robust in-game reporting system.
Michael Vance, CTO at Activision, emphasized the importance of creating a welcoming and fair environment for all players. In a statement, Vance said, “There’s no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate’s state-of-the-art machine learning technology that can scale in real-time for a global level of enforcement.”
To ensure the effectiveness of the voice chat moderation system, an initial beta rollout will commence in North America on August 30 within the existing games Call of Duty: Modern Warfare II and Call of Duty: Warzone. This will be followed by a full worldwide release (excluding Asia) coinciding with the launch of Call of Duty: Modern Warfare III. The system will initially be available in English, with support for additional languages added at a later date.
Activision’s partnership with Modulate represents a significant advancement in trust and safety measures within the gaming industry. Mike Pappas, CEO at Modulate, said in a statement, “We’re enormously excited to team with Activision to push forward the cutting edge of trust and safety. This is a big step forward in supporting a player community the size and scale of Call of Duty, and further reinforces Activision’s ongoing commitment to lead in this effort.”
Call of Duty’s existing anti-toxicity moderation systems have taken action against over a million accounts found to have violated the Call of Duty Code of Conduct since the launch of Call of Duty: Modern Warfare II. The implementation of updated text and username filtering technology has significantly improved the real-time rejection of harmful language. The company’s Ricochet anti-cheat technology has also combated cheating.
Call of Duty Voice Chat Moderation FAQ
An overview of Call of Duty’s beta initiative to improve voice chat. Information is subject to change.
Why has Call of Duty added voice chat moderation?
Call of Duty’s new voice moderation protects players by proactively identifying toxic behavior and enforcing the Call of Duty Code of Conduct, allowing our community to focus on the fun.
Player reporting remains valuable and is still available in game for players to flag instances of toxic or disruptive behavior they encounter; however, voice chat moderation will increase our ability to identify and enforce against bad behavior that has gone unreported.
How does Call of Duty’s voice chat moderation work?
Voice Chat Moderation is managed and operated by Activision and uses the AI-powered ToxMod model from Modulate. The system is integrated into select Call of Duty titles (see below). Voice chat is monitored and recorded for the express purpose of moderation.
Call of Duty’s Voice Chat Moderation system is focused on detecting harm within voice chat rather than specific keywords. Violations of the Call of Duty Code of Conduct are subject to account enforcement.
What types of disruptive behavior are detected?
The Call of Duty Voice Moderation system moderates based on the existing Call of Duty Code of Conduct. Voice chat that includes bullying or harassment will not be tolerated.
What are the penalties for violating the Call of Duty Code of Conduct?
Read the Security and Enforcement Policy for information regarding violations and penalties.
Which titles and regions are protected by Call of Duty’s voice chat moderation?
Initial beta rollout of the Call of Duty Voice Chat Moderation system will begin in North America only for Call of Duty®: Modern Warfare® II and Call of Duty: Warzone.
Global rollout, excluding Asia, will begin with Call of Duty: Modern Warfare III on November 10, 2023.
What languages are supported?
At initial beta rollout, the Voice Chat Moderation System will analyze voice chat in English. Following the global launch, voice chat moderation will expand to additional languages to be announced later.
Can I opt-out of voice chat moderation?
Players who do not wish to have their voice moderated can disable in-game voice chat in the settings menu.
I received a notification in-game that I was reported. How can I check my status?
The status of reports can be found in your in-game Notifications. Open the menu in the top right corner of the Home screen and navigate to Notifications (the bell icon). From there, select a Report Status Changed notification, then select View Report Status. The status screen will include the reason for and details of a report, as well as the duration of a penalty, if applicable.
Does voice chat moderation enforcement happen in real time?
Detection happens in real time, with the system categorizing and flagging toxic language based on the Call of Duty Code of Conduct as it is detected. Detected violations of the Code of Conduct may require additional reviews of associated recordings to identify context before enforcement is determined. Therefore, actions taken will not be instantaneous. As the system grows, our processes and response times will evolve.
Does this system ban “Trash-Talk” from Call of Duty?
The system helps enforce the existing Code of Conduct, which allows for “trash-talk” and friendly banter. Hate speech, discrimination, sexism, and other types of harmful language, as outlined in the Code of Conduct, will not be tolerated.
Does AI enforce violations of the Code of Conduct it detects?
Call of Duty’s Voice Chat Moderation system only submits reports about toxic behavior, categorized by behavior type and a rated severity level based on an evolving model. Activision determines how voice chat moderation violations will be enforced.