
How Europe’s privacy laws are failing victims of sexual abuse


Emma Holten was a teenager living in Denmark when her emails were hacked and her private information posted online.

The stolen files included intimate images of her — some from when she was 16. Ten years later, Holten is in her late twenties and working as a policy consultant for Denmark’s Women’s Council. Despite her best efforts to get them scrubbed, those photos are still on the internet.

“That first day was incredibly painful, but it was also mostly confusing,” she said. “It was only after that I really understood the scope of what was happening and how big of an impact it would have on my life.”

The posting of someone’s sexually explicit images online without their permission is a form of image-based sexual abuse better known as “nonconsensual pornography” or “revenge porn.” And when the right to erasure came into force as part of the EU’s General Data Protection Regulation (GDPR) in 2018, it seemed to offer better protections to victims.

The law allows for internet users to go directly to a platform and request the delisting of “inaccurate, inadequate, irrelevant, or excessive” personal information — in theory, giving individuals authority over which bits of themselves are online. But that hasn’t panned out.

Two years after GDPR was introduced, nonconsensual pornography is very much on the rise. Britain’s state-funded Revenge Porn Helpline registered a record 250 cases in April 2020 — double the number from the year before. The Irish charity Refuge confirmed that revenge porn is increasing across Ireland too, where images of 140,000 women were leaked last year. 

So why is the EU’s lauded data protection regime failing to protect victims of online sexual abuse?

Lawyers and researchers who spoke to POLITICO described GDPR as not fit for purpose when it comes to protecting these victims’ data. With the onus on users to chase their private images across the internet, to contact individual data controllers and to persuade reluctant platforms to act on the flagged material, the process can be exacting — and expensive if one chooses to outsource the work to a lawyer.

The right to erasure is “incredibly hard to implement” for nonconsensual pornography victims, according to obscenity lawyer Myles Jackman. Legal researcher Marthe Goudsmit said the process is “demanding.”

“The procedure can be lengthy, expensive and it is not always successful, depending on the website’s willingness to collaborate — if they even reply to your C&D [cease and desist] letter,” said Giorgio Patrini, CEO and lead scientist of Sensity, a visual threat intelligence company that defends both individuals and organizations from the spreading of “deepfake” content, another form of online sexual abuse whereby images and videos can be manipulated to be made explicit.

In addition, some platforms are glacial in taking down nonconsensual pornography. The messaging app Telegram is known to be particularly slow to remove such content. A recent report by Italian advocacy group PermessoNegato found the platform to be “refractory” and “complacent” in its response to reports of image-based sexual abuse. Some of the most popular pornography sites — Czech-owned XVideos and Xnxx — continue to profit from nonconsensual deepfake content.

Telegram, XVideos and Xnxx did not respond to POLITICO’s request for comment.

Until recently, the world’s most popular adult website, Pornhub, whose owners are headquartered in Luxembourg, also hosted explicit videos of women posted without their consent. After a report in The New York Times, the website last month committed to ending the practice and to better policing its platform.

“There is no incentive for adult platforms to remove damaging deepfake videos,” added Patrini. “Those videos are a source of monetization. Currently, no legislation mandates that publishers have to proactively remove damaging deepfake content.”

GDPR’s ‘theoretical rights’

With GDPR having little impact, some EU countries are taking unilateral measures. After reports of the leak, Ireland pushed through stringent legislation on online abuse in December; offenders could face up to 10 years in prison as well as unlimited fines. Belgium introduced severe penalties last year, including a prison sentence and a fine of up to €15,000.  

Belgium’s state-run Institute for the Equality of Women and Men (IEFH) was also empowered to provide legal assistance to victims of nonconsensual pornography — but it prefers not to “fall back” on EU law to do so. Instead of the “grand theoretical rights” offered by GDPR, explained the Institute’s legal expert Phedra Neel, the IEFH prefers the more “practical” provisions in national legislation or, if necessary, the 2000 e-Commerce Directive, which was intended to regulate platforms’ responsibility for the content uploaded to them.

Even then, the worldwide nature of the web limits how far such measures reach. Neel explained the “geographical limitations” of these remedies, which depend on how a platform is classified. The Court of Justice of the EU determined, for example, that Google only has to remove links within EU member countries, whereas Facebook can be ordered to remove content globally.

What’s more, when victim-turned-activist Holten tried to apply Danish data protection laws to her own situation, she ran into difficulties because the country hosting the site did not have similar legislation.

Holten also said countries don’t cooperate and don’t share information. “For a long time, police could just push it over to someone else and be like ‘this person is sitting in Hungary or in Italy,’” she explained, “and that means that they’re outside of our jurisdiction and it’s a different set of laws — even though I was suffering in Denmark from these people’s crimes.”

The U.K.’s data protection regulator, the Information Commissioner’s Office (ICO), said: “While we haven’t had reason to work with other data protection jurisdictions over a specific case in this context, we do have excellent relationships with them through the Global Privacy Assembly, Common Thread Network and various Memorandums of Understanding.”

The European Data Protection Board (EDPB), the network of national data protection regulators, was also unable to comment on data protection in this context as it has “yet to issue guidance on this topic specifically.” The EDPB did note, however, that it is “currently developing further guidance on data subject rights.”

More rules to come

Patrini said the EU should do more. “It is a crime to impersonate others by forging their signature, and to open a bank account in their name without their knowledge,” he added. “So it should be when my face is abused in digital media.”

Some lawmakers are aware of the problem. Speaking before Ireland criminalized the offence in December, Irish MEP Maria Walsh told a European Parliament hearing in the wake of the leak: “This event in Ireland is a clear example of blind spots which exist in member states when it comes to cyber violence.”

A European Commission spokesperson pointed to the recently proposed Digital Services Act (DSA), which would regulate how tech companies police their platforms, the Gender Equality Strategy 2020-2025, and an initiative to fight child abuse.

The DSA is set to have the most concrete impact on victims, explained the spokesperson. The rules include “harmonised measures on how to treat content which is illegal according to EU or national laws,” including cross-border removal orders between authorities and services established in other EU countries.

While activists welcome cross-border removal orders, under the proposed rules they may only be applicable to the territory of the issuing country. That means that while the images may no longer appear in their home country, there’s nothing to stop that image being seen in other countries where nonconsensual pornography isn’t actively criminalized — over two-thirds of the EU’s 27 countries have no specific laws against the practice.

Still, Holten said, it’s a start.

“I think being a victim in 2011 and being a victim in 2020 really is different,” she said. “There are some options available to victims today, legally, technologically, culturally, that were not available to me at that time at all. The activism has worked.”
