As of November 10, the release date of ‘Call of Duty: Modern Warfare 3,’ all players’ voice chats will be monitored by artificial intelligence (AI). This AI-powered technology, called ‘ToxMod,’ aims to identify toxic speech in multiplayer games in real time in order to create a friendlier gaming environment. The AI chat moderator will flag hate speech, discriminatory language, harassment, sexism, and bullying, all of which violate the Call of Duty code of conduct.
The AI does not listen for specific words; rather, it focuses on the intent behind players’ speech, aiming to distinguish harassment and bullying from friendly banter. ToxMod’s goal is to detect harm in voice chat by weighing factors such as emotion, speech acts, and listener responses. How effective ToxMod will be at moderating real-life speech, particularly at scale, remains to be seen. If ToxMod identifies a violation of the code of conduct, it will report the incident, and any further enforcement will be decided by human moderators at Activision.
Players who do not wish to be monitored by the AI can disable voice chat. Although AI chat moderation will be fully implemented on November 10, a beta rollout has been active in ‘Call of Duty: Modern Warfare 2’ and ‘Call of Duty: Warzone’ since August 30.
This whytry.ai article is a brief synopsis of the original article.