Meta has launched a content moderation tool called Hasher-Matcher-Actioner (HMA) to combat terrorist content, child exploitation material and other abusive content.
“Meta spent approximately $5 billion globally on safety and security last year and has more than 40,000 people working on it,” the company said in a statement.
Users can label an image or video as violating and enter it into HMA. The tool then creates a unique digital fingerprint, or hash, of that content, which can be shared and matched against copies uploaded elsewhere.
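As a rough illustration of that hash-and-match flow, the sketch below hashes flagged content and checks new uploads against a shared set of known hashes. It is a minimal sketch, not Meta's implementation: HMA is reported to rely on perceptual hashing, which can match near-duplicate images, whereas this example uses a plain SHA-256 digest for simplicity, and the in-memory database and function names are illustrative assumptions.

```python
import hashlib

# Illustrative only: a real system would use perceptual hashes that tolerate
# small edits; SHA-256 here stands in to show the basic hash-and-match flow.

def fingerprint(content: bytes) -> str:
    """Return a hex digest acting as the content's 'digital fingerprint'."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical shared database holding hashes of content already labelled
# as violating. Only hashes are stored, never the content itself.
violating_hashes: set[str] = set()

def label_as_violating(content: bytes) -> str:
    """A reviewer flags content; its hash is added to the shared database."""
    digest = fingerprint(content)
    violating_hashes.add(digest)
    return digest

def matches_known_violation(content: bytes) -> bool:
    """New uploads are hashed and checked against the shared database."""
    return fingerprint(content) in violating_hashes

if __name__ == "__main__":
    flagged = b"bytes of a flagged image"
    label_as_violating(flagged)
    print(matches_known_violation(flagged))             # True: exact copy matches
    print(matches_known_violation(b"different image"))  # False: no match
```

Because only hashes circulate, participating companies can detect copies of flagged material on their own platforms without exchanging the underlying images or videos, which is the point Meta makes about the shared database below.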
The announcement about the HMA launch comes ahead of Meta chairing the Global Internet Forum to Counter Terrorism (GIFCT) Operating Board. Meta added, “The more companies participate in the hash sharing database the better and more comprehensive it is—and the better we all are at keeping terrorist content off the internet, especially since people will often move from one platform to another to share this content.”