
‘Call Of Duty’ video game uses AI to monitor what players say, censors ‘Toxic Speech’

Call of Duty, a shooter video game published by Activision, has begun using artificial intelligence to monitor what players say during online matches, with the aim of flagging and cracking down on “toxic speech” more effectively, as online gaming seems poised to become censorship’s new frontier.

Activision recently said on its blog that Call of Duty is redoubling its fight against “hate speech” and other types of “toxic and disruptive behavior” among players in online chat by enlisting the help of AI to identify and control player behavior.

“Call of Duty’s new voice chat moderation system uses ToxMod, Modulate’s AI-powered voice chat moderation technology, to identify in real-time and enforce against toxic speech, including hate speech, discriminatory language, harassment and more,” the company said in the post.

The speech-monitoring algorithms, which online players cannot turn off, will record and analyze what they say in order to identify speech the company deems inappropriate for its online gaming spaces.

SOURCE LINK HERE

