Facebook will soon hold public consultations in Delhi to improve its community standards policy in an effort to provide transparency on how the social networking giant reviews and blocks content.
On Tuesday, Facebook also released a rule book for the types of posts it allows on its social network, giving far more details than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.
Community guidelines
Talking to Indian media over a video conference, Monika Bickert, Facebook’s vice-president of product policy and counter-terrorism, said the company is for the first time making its internal set of community guidelines public to offer more transparency.
Facebook considers changes to its content policy every two weeks at a meeting called the ‘Content Standards Forum’. Bickert pointed out that this is not a newly formed group; the forum has existed for over five years.
“We’ve had these meetings for more than five years and we meet every two weeks. But the meetings have evolved over time,” Bickert said.
According to Bickert, the way these meetings decide on policy changes has shifted dramatically in the last couple of years: not just internal stakeholders but also outward-facing groups are now called in to handle complex changes.
“We now have a team called stakeholder engagement and their job is to maintain relationships with hundreds of civil society groups and academics so that when we are discussing, say, animal rights or child safety, they know who to reach out to for a broader perspective,” Bickert said.
“For years, we’ve had Community Standards that explain what stays up and what comes down. Today, we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time, we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake,” Bickert said.
New ‘appeals’ process
Facebook is also introducing a new ‘appeals’ process to rectify errors made by its AI tools or human reviewers in blocking or allowing a post on the platform.
Appeals will be rolled out in phases, initially covering posts removed for nudity or sexual activity, hate speech or graphic violence.
Facebook has recently come under fire from several governments after Cambridge Analytica was found boasting about how it had used the platform to sway not just the US elections but also certain state elections in India by manipulating the content people see on Facebook.
Privacy advocates, while welcoming the move towards greater transparency, say it only highlights how much power a single company can have over public discourse.
“Transparency is the first step towards engagement and improvement. Reeling from the recent fiasco, it’s a welcome step that this is happening. Considering engagement from Facebook India is controlled and barely existent, the appeals process will be useful for many complainants. But please remember nothing is changing here; the standards are merely being made public. At a macro level, it further emphasises how much of public discourse is now in private hands, and how much power these companies exercise,” said Mishi Chaudhary, a technology lawyer and digital rights activist.
Chaudhary said Facebook could still continue to ban posts without giving users any explanation, as it holds no liability to do so as a private company. However, a lack of trust may hurt its advertising business, which would be of concern to Facebook.
“Without the users, the advertising business collapses. This is a good exercise to engage but is only bringing transparency and a promise of improvement. Whether or not it happens, only time will tell,” Chaudhary said.
In May, Facebook will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where the company will get people’s feedback directly. Bickert said the company plans to bring this to New Delhi in June or July this year.
Bickert said the company is also testing out new tools to prevent the spread of fake news.
“We are also testing a feature called ‘article context’ where we are putting a little icon on news stories that may not be accurate. You can click on that icon and get information about who that publisher is or who is behind that speech so that you can make a more informed choice,” she said.