(Reuters) – Meta Platforms Inc’s independent oversight board said on Thursday that Facebook should not have removed a newspaper report about the Taliban that it deemed positive, backing users’ freedom of expression and saying the tech company relied too heavily on automated moderation.
Meta found that the post, which announced that schools and colleges for women and girls in Afghanistan would reopen in March, violated Facebook policies because it “praised” entities deemed to “engage in serious harm offline.”
The company restricted the newspaper’s access to certain Facebook features after deleting the post.
The newspaper appealed the decision, after which the post was sent to a special moderation queue but was never reviewed, according to the Oversight Board.
The Oversight Board said Meta’s removal of the post was inconsistent with Facebook’s own policies, which allow reporting on such organizations; the company reversed its decision after the board selected the case.
“The Board found that Meta should better protect users’ freedom of expression when it comes to reporting on terrorist regimes,” the Oversight Board said. (https://bit.ly/3U8aUnP)
“By using automated systems to remove content, Media Matching Service banks can amplify the impact of incorrect decisions made by individual human reviewers,” the board added.
Meta’s Oversight Board, which includes academics, rights experts and lawyers, was created by the company to adjudicate a small slice of thorny content moderation appeals, but it can also advise on site policies.
(Reporting by Eva Mathews in Bengaluru; Editing by Shounak Dasgupta)