Opinion | Facebook’s Supreme Court Needs a Broader Mandate

Facebook’s Oversight Board, the handpicked band of outsiders charged with ruling on thorny issues at the company, handed down its first opinions after months of deliberation late last month.

The results were underwhelming.

The board, which some have likened to a Facebook Supreme Court, ordered the company to restore four posts it said shouldn’t have been removed under Facebook’s rules (there was a fifth removal that the board upheld). The posts went back up — but almost four months after they had been pulled down.

The board has limited power. If it decides that a post should be restored or removed, Facebook has to comply. But the policy reasoning underlying its decisions is treated as merely advisory.

Facebook could, for instance, choose to narrowly interpret the rulings to serve its own business purposes. Worse, the very existence of the board could give Facebook cover to continue operating pretty much as it wishes, without instituting broad reforms.

“If Facebook wants to minimize the impact of the board’s rulings, it will be able to find differences in posts that we didn’t rule on,” said Michael McConnell, a board co-chair and Stanford Law School professor, in an interview, though he is “somewhat optimistic” the company would not do so.

John Samples, a board member and vice president at the Cato Institute, said he expects Facebook to try to apply the policy underlying the board's rulings to other posts, but "the Oversight Board does not have the bureaucracy to figure out whether they're carrying that out."

The board will also take on only a vanishingly small number of cases each year — it initially selected five out of 150,000 submissions — meaning the vast majority of other challenges will fall to Facebook’s own moderators. For now, the board is only reviewing cases where posts or accounts may have been improperly taken down. It won’t yet look at the trickier question of what content should no longer be left up.

The restoration of posts from October and November doesn’t feel quite like the sea change many have called for at Facebook, especially in light of the ample evidence showing the site was used to help organize the incursion at the Capitol.

Facebook needs to do more to demonstrate it is taking calls for reform seriously. The company should institute broader policy changes informed by the board's findings and add transparency around which posts are affected. It should also quickly expand the board's mandate to include decisions about which posts, and even accounts, should be removed, rather than just which should be restored.

The Oversight Board, made up of 20 academics, lawyers, writers, politicians and other heavyweights from around the globe, approached its task with evident rigor. Its rulings on the posts each ran thousands of words, citing Facebook’s dense community standards guidelines as if they were case law. The group overruled Facebook in four of the five cases, involving matters of nudity, coronavirus misinformation, Nazi propaganda and hate speech.

In its findings, the board said Facebook had improperly removed posts from a user criticizing the French government for withholding an alleged coronavirus cure; one attempting to quote the Nazi official Joseph Goebbels; and one from a user in Myanmar disparaging Muslims. The board also found that an Instagram post showing nipples in the context of cancer awareness should not have been removed, a decision Facebook itself had previously reversed. The Oversight Board agreed with Facebook on its removal of a posting with a Russian slur for Azerbaijanis.

Mr. McConnell, of the board, said Facebook could decide that, for instance, posts displaying nipples in an anatomical, rather than sexual, context might still warrant removal, even when they appear to fall within the intent of the Oversight Board's ruling.

“There is no obligation for them to accept any or all of our policy recommendations,” he said.

Facebook acknowledged as much, calling the board's policy recommendations "advisory," and saying that posts similar to those ruled on would be removed when "feasible."

“I can only reassure you that it’s the team’s intention to follow the recommendations of the Oversight Board,” said Nick Clegg, Facebook’s vice president of global affairs. The company was given 30 days to address the policy recommendations.

The board also suggested that the company tell users more clearly why their posts were removed and let them appeal decisions made by software to human moderators. But those recommendations, too, the company can choose to ignore.

How Facebook responds takes on great importance in just under three months, when the board rules on whether it was appropriate, under the company’s rules, to have banned Donald Trump from its sites. That ruling could have vast implications for how Facebook handles political speech, particularly when world leaders spew racial invective or misinformation about the coronavirus or voting.

But with Facebook’s tolerance for politicians and other influential users telling outright lies, it is easy to imagine the board’s ruling on Mr. Trump being narrowly applied so that, for instance, Brazil’s president, Jair Bolsonaro, can continue promoting unproven coronavirus treatments. The company bent over backward to accommodate Mr. Trump until it was politically expedient to turn on him.

Still, the very existence of the board marks a meaningful step forward for Facebook, which has resisted outside pressure to change. The company has the opportunity to show it truly wants to reform itself. For the good of democracy and decency, let’s hope it seizes it.
