
Facebook’s ‘supreme court’ struggles to set global free speech rules


Two months after a group of outside experts started ruling on what people can post on Facebook, cracks in the so-called Oversight Board are already starting to show.

So far, the independent body of human rights experts, free speech supporters and legal scholars that rules on what content Facebook must take down or put back up has reversed the social media giant’s decisions in four out of its first five cases. The biggest test is yet to come: deciding whether or not to reinstate former U.S. President Donald Trump’s account after it was blocked for inciting violence around the Capitol Hill riots in January.

These initial decisions — the Trump announcement is expected by mid-April, at the earliest — highlight the unwieldy job that the Oversight Board has on its hands: to create a single set of free speech standards that apply to posts from around the world.

The group’s in-tray reads like a greatest hits compilation of thorny problems: COVID-19 misinformation in France; religious tensions in India; blackface in the Netherlands.

No matter which way the group eventually rules, it’s likely to cause controversy.

Critics say it acts too harshly in some cases or gives a free pass to content that many find offensive in others. That tension goes to a central problem underlying the Oversight Board’s work: establishing a one-size-fits-all approach to content moderation by applying global free speech norms when Facebook posts are rooted in local cultures.

“The Board is applying international human rights law to Facebook as if it was a country. That’s impossible,” Evelyn Douek, an online free speech expert at Harvard University’s Berkman Klein Center for Internet & Society, told Digital Bridge, POLITICO’s transatlantic tech newsletter.

“It’s the first body that’s using international human rights law to make content decisions,” she added. “Now that we’re getting down to brass tacks, it’s difficult.”

‘Set up to fail’

Many of those associated with the Oversight Board acknowledge there’s a tricky balance in deciding on cases whose outcomes will inevitably be felt worldwide.

Under Facebook’s rules, the Board’s decisions on specific posts under review (and those similar to them) are binding on the social network, though its suggestions on how the company’s wider community standards should be tweaked are only advisory.

So far, the tech giant has followed much of the group’s advice on how to update its content rules. The Board can only review cases related to material that has already been taken down — a limitation that critics claim hobbles the group’s ability to act. But it is expected to gain greater powers later this year.

“Everyone can see that these are not easy cases and it has been difficult to come to a final decision,” said Helle Thorning-Schmidt, a former Danish prime minister and the Board’s co-chair.

Confronted with the problem of what to do with potentially harmful posts, policymakers in the European Union, the United States and beyond have proposed legislative overhauls to force social media companies to take greater responsibility.

But many countries have dragged their feet, often leaving it to the tech firms to police what should, and should not, be allowed online. The most recent plans include the European Commission’s Digital Services Act, published in December, that will include hefty fines if social media platforms do not quickly remove illegal content. Those rules will not come into force until 2023, at the earliest.

Thomas Hughes, the Oversight Board’s administrative director — who does not make content rulings — said the five-person panels reviewing each decision often relied on local experts to hash out the intricacies of how posts were viewed within specific cultures.

But, he added, the group’s goal was to build a body of legal precedents that could steer Facebook’s decisions on platform-wide free speech rules, even if that meant the experts made decisions that went against national laws. While he did not cite any specific countries, Russia and Turkey have passed rules banning people from publishing online posts critical of local leaders.

“I foresee that we will come to the point in which the Board overturns a piece of content which maybe runs contrary to a country’s national legislation, but which people feel is not compliant with international human rights standards,” he said. “That will be an interesting moment.”

Not everyone is supportive of the Oversight Board’s work.

Roger McNamee, an early investor in Facebook who has become one of its most vocal critics, said the Board was not independent from the tech giant, and did not have the power or resources to tackle the hate speech and other harmful content posted online each day. The American investor said that because the Board could not force changes to the tech giant’s community standards, it would not be able to effect meaningful change.

“It’s a hopelessly inadequate and deeply compromised initiative aimed to address specific issues, but do nothing about the wider problem,” said McNamee, who co-founded the Real Facebook Oversight Board, a campaigning group critical of its namesake. “It’s a group of well-intentioned people who have been set up to fail.”

Local problems, global answers

Much of the world’s attention will soon focus on whether Trump’s account will be reinstated. But another case — announced on the same day Trump’s suspension was referred to the Board — goes to the heart of the difficulty of setting global rules for free speech.

Late last year, Facebook removed a short video uploaded by a Dutch user that included “Zwarte Piet,” or Black Pete, a controversial character that belongs to a local tradition in which people dress up in blackface during the December holiday of Sinterklaas.

The company said the post broke its hate speech standards, which were updated to include blackface following the Black Lives Matter protests last year. The user appealed, arguing the video should be reinstated because it was dedicated to their child, who wanted to see it back on Facebook.

Even in the Netherlands, Zwarte Piet is a divisive subject. Some see the character as inherently racist. Others see it as part of a longstanding Dutch tradition. The Board must now pick a side in a decision that will have an impact far beyond the Netherlands’ borders.

The group has already had to grapple with similar controversies.

In its first rulings, the Board ordered Facebook to reinstate posts that criticized Muslims or quoted Joseph Goebbels, the Nazi propagandist, because it believed such content did not break the social network’s rules, despite causing outrage among online users.

“There’s not a global freedom of expression to be found, even if the global internet companies want to find one,” said Will Moy, chief executive of Full Fact, a British fact-checking nonprofit that works with Facebook to debunk misinformation. “Some of these policies need to be country specific. Who should be making those decisions?”
