Sarah Ruivivar

AI tackles toxicity in Call of Duty chats

Updated: Oct 31, 2023


Ever been in a Call of Duty voice chat and wished you could mute the toxicity?


Well, Activision is on it. The gaming giant is teaming up with Modulate to introduce "in-game voice chat moderation" to its titles, using an AI technology called ToxMod. This is a bold move towards creating a more positive gaming environment.

Call of Duty, a first-person shooter franchise, has a reputation for the negativity of its lobbies and voice chats. The fan base has even been dubbed the most negative in gaming. Activision has been battling this issue for years, and it seems AI might be the secret weapon.

ToxMod, currently in its beta rollout in North America, is active within Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full worldwide release (excluding Asia) is set for November 10th, coinciding with the release of Call of Duty: Modern Warfare III.

But how does ToxMod work? It's not just about transcribing words. This clever tool analyses the nuances of each conversation, considering factors like players' emotions and volume. It can differentiate between genuinely harmful statements and playful banter, ensuring that the fun isn't sucked out of the game.
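To make that idea concrete, here is a minimal, purely illustrative sketch of how a moderation tool might blend tone and volume with a transcript score. This is not Modulate's actual model or API; the signal names, weights, and the banter discount are all assumptions.

```python
from dataclasses import dataclass

# Illustrative only: these signals and weights are assumptions,
# not ToxMod's real design.
@dataclass
class VoiceClipSignals:
    transcript_toxicity: float  # 0-1 score from a speech/text classifier
    emotion_intensity: float    # 0-1, e.g. detected anger or aggression in tone
    relative_volume: float      # 0-1, loudness versus the speaker's own baseline
    directed_at_teammate: bool  # context: is the remark aimed at a teammate?

def harm_score(sig: VoiceClipSignals) -> float:
    """Blend several signals rather than relying on the words alone."""
    score = (0.6 * sig.transcript_toxicity
             + 0.25 * sig.emotion_intensity
             + 0.15 * sig.relative_volume)
    # Calmly delivered trash talk between teammates is discounted as banter.
    if sig.directed_at_teammate and sig.emotion_intensity < 0.3:
        score *= 0.5
    return min(score, 1.0)
```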

However, it's important to note that ToxMod won't be dishing out penalties. It will merely flag up bad behaviour and provide reports to Activision's human moderators. This is a crucial safeguard, as speech recognition systems can display bias, particularly against speakers with different accents or racial identities.
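A simplified picture of that flag-and-review handoff, again with hypothetical names and a made-up threshold, might look like this:

```python
import queue

# Hypothetical review flow: the tool flags, humans decide.
REVIEW_THRESHOLD = 0.7  # assumed cut-off; the real policy is not public

human_review_queue: queue.Queue = queue.Queue()

def handle_clip(clip_id: str, harm_score: float, transcript: str) -> None:
    """Queue a report for a human moderator instead of issuing a penalty."""
    if harm_score >= REVIEW_THRESHOLD:
        human_review_queue.put({
            "clip_id": clip_id,
            "score": round(harm_score, 2),
            "transcript": transcript,  # context for the human reviewer
        })
    # Below the threshold, nothing happens: there is no automated punishment.
```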

So, while ToxMod is a significant step towards reducing toxicity in gaming, the human touch remains vital. After all, we're all here for a good game, not a row. Let's play fair, folks!



Made with TRUST_AI - see the Charter: https://www.modelprop.co.uk/trust-ai


