For those of you who enjoy battling it out in Call of Duty’s multiplayer, it’s time to watch what you say over voice chat. Activision is rolling out a new voice chat moderation feature for its current and upcoming Call of Duty games: an AI-based tool that monitors players’ voice communication during matches and takes action against toxic behavior.
Introducing ToxMod for Call of Duty Modern Warfare 3
In an effort to tackle toxicity in the game, Call of Duty’s dedicated anti-toxicity team is introducing ToxMod. What exactly is it, and how does it work? According to Activision, ToxMod is an AI-powered voice chat moderation technology developed by Modulate. Built on advanced machine learning, it will initially support English, with more languages planned for later. The full release of the moderation software is scheduled alongside Call of Duty: Modern Warfare 3 later this year.
The goal of ToxMod is to enhance player safety. It actively analyzes conversations to identify what qualifies as “toxic” behavior. This enables Call of Duty’s multiplayer moderators to promptly respond to reports, especially those flagged by Modulate’s ToxMod AI moderation system. Activision claims that ToxMod can identify various forms of toxic behavior, including:
- Hateful speech
- Bullying
- Discriminatory language
- Harassment
Will ToxMod be Available in CoD Warzone & Modern Warfare 2?
Yes, Activision plans to introduce ToxMod through an initial beta rollout in their existing games, including Call of Duty: Modern Warfare 2 and Call of Duty: Warzone. The beta testing phase commenced on August 30, 2023.
So, if you’re an active Call of Duty multiplayer player, you’ll soon see improved voice chat moderation with ToxMod. Player reports can be addressed more efficiently, as ToxMod attaches the relevant context to each flagged incident.
Since the launch of CoD: Modern Warfare 2, Activision reports having taken action against over 1 million toxic players. Encouragingly, 20% of those players improved their behavior after the initial action, while repeat offenders faced stricter consequences. The developers believe their efforts to reduce in-game toxicity are paying off.
With ToxMod in place, we can hope for better interactions within the Call of Duty online multiplayer community. Xbox also recently introduced a new strike system to address toxicity, with which Microsoft aims to minimize toxic behavior across the entire Xbox network.
It’s impressive to see game developers continually incorporating new technologies to combat the issue of online toxicity. With advancements like ToxMod, we might witness a significant decrease in toxic behavior. What are your thoughts on the upcoming ToxMod update for Call of Duty games? Share your opinions in the comments below.