by Kate Anderson
Meta’s Oversight Board ruled Wednesday that Facebook and Instagram showed “patterns of censorship” by removing posts about abortion that the social media platforms claimed constituted death threats.
The board had been weighing a series of posts that Meta, the parent company of Facebook and Instagram, initially removed as potential death threats against both pro-abortion and pro-life advocates; the posts were later reinstated after appeals from the users. The board took up the cases in June and announced this week that Facebook had erred by removing the posts, according to the ruling.
“While Meta acknowledges its original decisions were wrong and none of the posts violated its Violence and Incitement policy, these cases raise concerns about whether Meta’s approach to assessing violent rhetoric is disproportionately impacting abortion debates and political expression,” the decision reads.
One of the posts was from a Facebook account that shared an image titled “Pro-Abortion Logic” that read, “We don’t want you to be poor, starved or unwanted. So we’ll just kill you instead,” according to the board’s overview of the cases in the ruling. The second and third posts focused on a GOP bill in South Carolina that pro-abortion advocates argued would mean a woman could face the death penalty if convicted of obtaining an abortion.
In response, the second post claimed that the bill was so pro-life that “we’ll kill you dead if you get an abortion,” and the third post argued that pro-life legislators believed it was “wrong to kill, so we are going to kill you,” according to the decision. The board determined, however, that the posts could not be “reasonably interpreted as threatening or inciting violence.”
“While each uses some variation of ‘we will kill you,’ expressed in a mock first-person voice to emphasize opposing viewpoints, none of the posts expresses a threat or intent to commit violence,” the board wrote. “Discussion of abortion policy is often highly charged and can include threats that are prohibited by Meta. Therefore, it is important Meta ensure that its systems can reliably distinguish between threats and non-violating, rhetorical uses of violent language.”
The board expressed concern that the social media company had not convinced it that the “errors in these cases are outliers” rather than a systemic problem, according to the ruling. The board further noted that Meta needs to regularly provide enforcement data under its Violence and Incitement policy so the board can analyze how accurately the company removes content.
A spokesperson for Meta directed the Daily Caller News Foundation to a prepared statement that said the company plans to “implement the board’s decision once it has finished deliberating.”
– – –
Kate Anderson is a reporter at Daily Caller News Foundation.
Photo “Meta Sign” by Nokia621. CC BY-SA 4.0.