Facebook to take action against users repeatedly sharing misinformation

Reuters | Updated May 27, 2021 at 09:24 AM

1.3 billion fake accounts were taken down between October and December

FILE PHOTO: A 3D-printed Facebook logo is seen placed on a keyboard in this illustration taken March 25, 2020. REUTERS/Dado Ruvic/Illustration/File Photo

Facebook Inc. said on Wednesday it would take "stronger" action against people who repeatedly share misinformation on the platform.

Facebook will reduce the distribution of all posts in its News Feed from a user account if that account frequently shares content flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.

It added that it was also launching ways to notify people when they are interacting with content that has been rated by a fact-checker.

False claims and conspiracies have proliferated on social media platforms, including Facebook and Twitter, during the COVID-19 pandemic.

"Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps," the company said in a statement.

Earlier this year, Facebook said it took down 1.3 billion fake accounts between October and December, ahead of a hearing by the U.S. House Committee on Energy and Commerce into how technology platforms are tackling misinformation.

Published on May 27, 2021 03:54