Facebook employees warned the company for years that it was failing to effectively respond to abusive and harmful content, particularly in developing countries, according to leaked documents and interviews with former employees.
Facebook knows it hasn’t hired enough workers with both the language skills and the knowledge of local events needed to identify rule-violating posts in a number of developing countries, Reuters reported, citing leaked company documents.
And the artificial intelligence systems Facebook uses to detect such content often don’t perform well enough, the documents show.
At the same time, the company has not made it any easier for users globally to flag posts that they themselves think violate the platform’s rules, according to the documents, which were leaked by whistleblower Frances Haugen.
Redacted versions of the documents, which were shared in formal complaints with the Securities and Exchange Commission as well as Congress, were then shared with a consortium of news organizations, including Reuters, which are expected to publish a slew of stories Monday.
In a review posted to Facebook’s internal message board last year, one employee reported “significant gaps” in certain countries at risk of real-world violence, especially Myanmar and Ethiopia.
Facebook’s international reach has been key to its continued growth, especially as calls for regulation in countries like the US and UK have mounted.
The company now operates in more than 190 countries and boasts more than 2.8 billion monthly users who post content in more than 160 languages.
But Facebook’s failure to effectively police content in some parts of the world shot to global attention in 2018, after experts from the United Nations investigating the ethnic cleansing of Myanmar’s Rohingya Muslim minority said Facebook was widely used to spread hate speech against them.
As a result, the company said it would increase staffing in certain countries, like Myanmar, with workers who were more aware of local events and spoke the local language so they could monitor content, a former employee told Reuters.
But as recently as this month, Reuters found Facebook posts in Amharic, one of Ethiopia’s most common languages, describing different ethnic groups as the enemy and issuing death threats.
Thousands have been killed and millions displaced in a nearly year-long conflict in the country between the Ethiopian government and rebel forces in the Tigray region.
Internal reviews also indicated that Facebook has had problems with content moderation across much of the Middle East, India, other parts of Asia and North Africa, according to Reuters.
And addressing the issue was not a priority for company management, three former Facebook employees who worked in the company’s Asia Pacific and Middle East and North Africa offices in the past five years told Reuters.
Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa, who left in 2017, described the company’s approach to global growth as “colonial,” echoing claims made by Haugen, who said the company put profits before people.
Facebook spokesperson Mavis Jones said in a statement that the company has native speakers around the world who review content in more than 70 languages.
The company also has experts in humanitarian and human rights issues who are working to stop abuse on Facebook’s platform in places where there is a heightened risk of conflict and violence, she said.
“We know these challenges are real and we are proud of the work we’ve done to date,” Jones said.