Call of Duty will now use AI “ToxMod” system to combat voice chat harassment

Robert Collins



Call of Duty games will begin using AI in voice chat moderation by implementing a model known as “ToxMod.”

The model comes from Modulate, a Boston-based company. According to the company’s site,

ToxMod understands not just what is being said, but also whether it causes harm, helping you catch magnitudes more toxicity than player reporting and speech-to-text solutions.

Billed as “the only proactive voice-native moderation solution available,” ToxMod has seen wide use in the gaming industry, including by Riot Games, publisher of online hits such as League of Legends and Teamfight Tactics.


According to the Call of Duty Voice Chat Moderation FAQ, the AI focuses on “detecting harm within voice chat versus specific keywords.” It “only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations.”
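The FAQ’s description of categorized, severity-rated reports feeding a separate enforcement decision can be sketched roughly as follows. This is a hypothetical illustration only: ToxMod’s actual report format, field names, categories, and severity scale are not public, so everything here is an assumption.

```python
from dataclasses import dataclass

# Hypothetical report structure, loosely modeled on the FAQ's description
# ("categorized by its type of behavior and a rated level of severity").
@dataclass
class ModerationReport:
    category: str   # assumed example categories, e.g. "harassment"
    severity: int   # assumed scale: 1 (low) to 5 (high)

def reports_for_enforcement(reports, threshold):
    """Per the FAQ, the model only submits reports; the publisher decides
    what to enforce. Here that decision is reduced to a severity cutoff."""
    return [r for r in reports if r.severity >= threshold]

reports = [
    ModerationReport("harassment", 4),
    ModerationReport("trash talk", 1),
]
flagged = reports_for_enforcement(reports, threshold=3)
print([r.category for r in flagged])  # ['harassment']
```

The point of the split is visible even in this toy version: the detection side produces structured reports, while the enforcement threshold lives entirely on the publisher’s side.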

For now, the system is rolling out in Modern Warfare 2 and Warzone in North America, and it will also be used in the upcoming Modern Warfare 3.