Call of Duty games will begin using AI in voice chat moderation by implementing a model known as “ToxMod.”
The model comes from Boston-based company Modulate. According to the company's site, ToxMod "understands not just what is being said, but also whether it causes harm, helping you catch magnitudes more toxicity than player reporting and speech-to-text solutions."
Billed as "the only proactive voice-native moderation solution available," ToxMod has seen wide usage across the gaming industry, including by Riot Games, publisher of online hits like League of Legends and Teamfight Tactics.
According to the Call of Duty Voice Chat Moderation FAQ, the AI "focuses on detecting harm within voice chat versus specific keywords." However, it "only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations."
The system is currently rolling out in Modern Warfare 2 and Warzone in North America, and it will also be used in the upcoming Modern Warfare 3.