12 days before election, Meta oversight board frets about political speech moderation
With 12 full days left until the 2024 U.S. presidential election, there are still concerns about whether Meta can properly moderate political content on its platforms.
Two months ago, a Facebook user put the faces of presidential candidate Vice President Kamala Harris and her running mate Minnesota Governor Tim Walz on top of Jim Carrey and Jeff Daniels’ characters in the Dumb and Dumber movie poster. The poster features Carrey and Daniels’ characters grabbing each other’s nipples — you know the one. The user then captioned it with the emojis “🤷‍♂️🖕🖕.”
Meta removed the post for violating the platform’s Bullying and Harassment Community Standard, which prohibits “derogatory sexualized photoshop or drawings.” The user appealed, sending the case to Meta’s Oversight Board, and once the board took it up, Meta “determined its removal was incorrect, restoring the post to Facebook.”
Meta’s Oversight Board is a group of more than 20 members, including academics, policymakers, and journalists, that rules on content moderation decisions for Facebook and Instagram. The board wrote that Meta’s handling of this post raises “serious concerns” about the company’s ability to moderate political content on its platforms.
“This post is nothing more than a commonplace satirical image of prominent politicians and is instantly recognizable as such,” the board wrote. It continued:
In the context of elections, the Board has previously recommended that Meta should develop a framework for evaluating its election integrity efforts in order to provide the company with relevant data to improve its content moderation system as a whole and decide how to best employ its resources in electoral contexts.
It then referred to a previous case involving a general in Brazil who, amid that country’s 2022 elections, called on activists to “hit the streets” — a remark tied to protesters storming Brazil’s National Congress and Supreme Court. The board at the time made the opposite determination: that Meta failed by not removing the general’s remarks in time to avoid inflaming a volatile situation. The board, apparently, wants to see progress on a system of rules that would cover both that past false negative and this false positive.
Its statement continued:
Meta has reported progress on implementing this recommendation. Nonetheless, the company’s failure to recognize the nature of this post and treat it accordingly raises serious concerns about the systems and resources Meta has in place to effectively make content determinations in such electoral contexts.
The board wrote that it sees an “overenforcement of Meta’s Bullying and Harassment policy with respect to satire and political speech in the form of a non-sexualized derogatory depiction of political figures.” It said this overenforcement could “lead to the excessive removal of political speech and undermine the ability to criticize government officials and political candidates, including in a sarcastic manner.”