
5 people defining platform liability


This article is part of POLITICO’s Changemakers series, looking at the players driving European policy.

The time has come for Google, Facebook, Amazon and TikTok to face new rules in Europe.

Under the current legal framework, platforms are not legally responsible for hosting illegal content, but are required to remove such material once it is flagged. That may change.

The European Commission has pledged to present by the end of the year a new framework that will “increase and harmonize the responsibilities of online platforms,” setting EU-wide requirements for how companies police illegal content online in the Digital Services Act package.

Lawmakers argue that a change is overdue. Lobbyists are on the warpath. Here are five people and organizations whose voices will matter in defining the new rules.

Thierry Breton — European commissioner for the internal market

European Commissioner in Charge of Internal Market Thierry Breton | Olivier Hoslet/EFE via EPA

Tasked by Commission President Ursula von der Leyen with leading the work on the Digital Services Act, the Frenchman manages the Commission’s digital department, which will hold the pen.

Breton told MEPs in his November confirmation hearing that the Commission would not touch the platforms’ limited liability regime. He later said that all options were still on the table. (Commission Executive Vice President for Digital Margrethe Vestager similarly backtracked and said the liability question was still open.)

In February, Breton dismissed Facebook CEO Mark Zuckerberg’s proposal to create a third status for the social media giant that would fall between telecoms provider and publisher. “They have a responsibility,” Breton told reporters after meeting Zuckerberg in Brussels. “It’s up to them to see the impact of their responsibility before we tell them so.”

The commissioner also said he is against general monitoring obligations — or the requirement to check all content before it is posted — but argues that “stricter control” is needed when it comes to hate speech, terrorist content or fake news online.

Breton won’t work on platform liability alone. Executive Vice President Margrethe Vestager, Vice President for Values and Transparency Věra Jourová and Justice Commissioner Didier Reynders will also chip in.

Christine Lambrecht — Germany’s justice minister

German Justice Minister Christine Lambrecht | Tobias Schwarz/AFP via Getty Images

Christine who? That was the most common reaction in Berlin when Germany’s Social Democrats (SPD) announced last summer that the lawyer and longtime member of parliament would take over as justice minister from her charismatic predecessor Katarina Barley, who was leaving to become a member of the European Parliament.

Today, there are few in the European tech industry who don’t know her name.  

Since taking office in June, Lambrecht has quietly introduced two pieces of legislation that would toughen Germany’s Network Enforcement Act, better known as NetzDG, which governs speech online. Both are expected to pass parliament this spring.

The first law would force big social media companies to proactively report potentially criminal content on their platforms to law enforcement. The second aims to make it easier for users to report illegal content and challenge content decisions by the internet platforms. It also requires companies to disclose more information than was previously required in their biannual transparency reports, including details about which groups of people are particularly affected by hate speech or how companies are using artificial intelligence to detect harmful content.

Lambrecht has hinted that Berlin’s rule book could serve as a role model for other EU countries. “In many European countries, populists and extremists are rioting against democracy, dissenters and minorities,” she said. “The platforms are the same, and the racist and anti-Semitic messages are similar.”

Cédric O — France’s junior digital minister

French Digital Affairs Minister Cedric O | Yonhap/EFE via EPA

Like Germany, France is at the forefront of online content regulation and hopes to influence future EU rules.

The country adopted legislation against fake news in 2018, and its parliament is working on legislation that would require internet and social media platforms to remove flagged hate speech within 24 hours, or face fines. Though the European Commission slammed the draft law and asked France to postpone the project, Paris is moving forward — and O, the country’s junior digital minister, is planning to take the fight to Brussels.

In February, the French government announced the formation of a working group on the Digital Services Act. It will focus on “structural platforms,” as well as on platforms’ “responsibility” when it comes to online hate speech and consumer protection on marketplaces, said Finance Minister Bruno Le Maire and O.

France wants the EU to leave some margin of maneuver for national governments. “On matters of national responsibility such as the regulation of hate speech, a certain margin of appreciation must be left to states,” O recently told POLITICO. “French, Swedish and European culture don’t share the same idea of how we should balance freedom of expression and [citizens’] protection.”

Tiemo Wölken — MEP from the Socialists and Democrats group, Germany

Tiemo Wölken | European Parliament

As the European Parliament prepares to weigh in on platform liability, three committees — on legal affairs, internal market and civil liberties — will draft so-called initiative reports on the Digital Services Act. Although nonbinding, these texts will provide insight on where political groups in the Parliament stand. 

Tiemo Wölken, a German MEP from the S&D group who campaigned against the last Commission’s copyright reform, will draft the legal affairs committee’s report — and he already has some ideas. 

Wölken advocates for more transparency by internet platforms and for “more power to the judicial system rather than asking platforms to erase more content.”

“We need to look at the issues of where platforms make money. The answer is that they make money with advertising,” he told MEPs in February.

“The combination of algorithms that promote blind content that generates the most income and clicks is clearly a problem, because it means that content beneficial to societies might end up disappearing,” he added.

Wölken is not the only MEP to watch on platform liability. Alex Agius Saliba, also a Socialist, will lead the work in the internal market committee. The newly elected Maltese parliamentarian appears to focus more on the platforms’ market power.

Siada El Ramly — director general of tech lobby group EDiMA

Siada el-Ramly | Emmanuel Dunand/AFP via Getty images

EDiMA is a lobby group representing most of the companies that would be affected by a change in the platforms’ liability regime. Google, Facebook, Twitter, Amazon and smaller companies such as Etsy and Yelp are members.

In early January, EDiMA released a paper pitching an “Online Responsibility Framework” — one of the clearest wish lists to be spelled out by the tech sector since the Digital Services Act was officially announced by Commission President Ursula von der Leyen. That paper was drafted by all members over more than a year, under the leadership of Siada El Ramly.

One of the main messages: Responsibility should be distinguished from liability. Platforms need a good Samaritan clause that would hold them not liable for the content they host if they “take additional measures” to tackle illegal material, EDiMA argues.

The lobby group also advocates against general monitoring obligations.

Other aspects that should be taken into account in the discussion around tackling illegal content online include a potential EU oversight body, according to EDiMA. Breton has already dismissed the idea after meeting with Mark Zuckerberg.
