Meta has launched a content moderation tool called Hasher-Matcher-Actioner (HMA) to combat terrorist content, child exploitation, and other abusive content.
“Meta spent approximately $5 billion globally on safety and security last year and has more than 40,000 people working on it,” the company said in a statement.
Users can label an image or video as violating content and enter it into HMA. The tool then creates a unique digital fingerprint, or hash, of that content.
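The hash-and-match flow can be sketched as follows. Note that HMA builds on perceptual hashing (Meta has open-sourced algorithms such as PDQ for this purpose), which matches visually similar media; the cryptographic digest below is only a simplified stand-in fingerprint, and all names here are illustrative assumptions, not HMA's actual API.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a digital fingerprint (hash) for a piece of media.

    Stand-in for a perceptual hash: a real system would match
    near-duplicates, not just byte-identical copies.
    """
    return hashlib.sha256(content).hexdigest()

def matches(content: bytes, shared_hash_db: set) -> bool:
    """Check a new upload against the shared hash database."""
    return fingerprint(content) in shared_hash_db

# A moderator labels one item as violating and enters its hash:
shared_db = {fingerprint(b"example-violating-image-bytes")}

# The same content uploaded elsewhere now matches the database:
print(matches(b"example-violating-image-bytes", shared_db))  # True
print(matches(b"some-benign-image-bytes", shared_db))        # False
```

Because only hashes are shared, participating platforms can flag known violating content without exchanging the media itself.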
The announcement about the HMA launch comes ahead of Meta chairing the Global Internet Forum to Counter Terrorism (GIFCT) Operating Board. Meta added, “The more companies participate in the hash sharing database the better and more comprehensive it is—and the better we all are at keeping terrorist content off the internet, especially since people will often move from one platform to another to share this content.”