
EU’s amended Digital Services Act fails to better regulate ‘revenge porn’

European politicians failed to tackle digital image-based abuse, especially on porn sites, say survivors of ‘revenge porn’

Jelena Prtorić
13 May 2022, 12.01am

Although several countries have criminalised non-consensual image sharing, it is still hard for women to seek justice due to the crime’s transnational character | Illustration by Inge Snip for openDemocracy. All rights reserved

In 2011, nude photos that 19-year-old Emma Holten had taken for her then-boyfriend were stolen and shared on the internet, often alongside private information such as her address, phone number and a link to her Facebook account. When she reported this to the police in Copenhagen, where she lives, she soon realised that they had neither the infrastructure nor the technical expertise to find the perpetrator(s).

They even told her that the images “weren’t so bad”. “I guess this means that I wasn’t engaging in sex or that I wasn’t fully naked but topless,” she recalled.

More than a decade after they were first leaked, Holten still doesn’t know who stole her photos. Even today, the images keep resurfacing on different websites. Holten’s experience led her to become an activist against gendered abuse, and particularly digital image-based abuse, which is commonly labelled ‘revenge porn’.

Digital image-based abuse encompasses a range of abusive behaviours, from taking and/or sharing somebody’s intimate images without their consent to ‘deepfakes’: fake, artificially generated pornographic videos.


Although there is no comprehensive data as to how widespread the phenomenon is, a 2021 report by the UK-based charity Revenge Porn Helpline (RPH) said that the number of reports they received doubled in 2020, reaching a record high of 3,146 cases. Between 2015 (when RPH was founded) and 2020, the helpline removed nearly 200,000 pieces of content.

Last year, Hate Aid, a Berlin-based charity offering support to victims of digital violence, surveyed 2,000 people aged 18 to 80 from across the European Union, and found that 30% of women polled feared their intimate images might end up being shared online.

Although several countries have criminalised non-consensual image sharing, it is still hard for women to seek justice due to the crime’s transnational character – the perpetrators, platforms and servers can all be based in different countries.

Digital Services Act: disappointing

Many activists – and survivors – hoped that the recent revision of the Digital Services Act (DSA), the legislative framework governing the digital landscape in the European Union, would provide better regulation of some of the platforms where this kind of content is shared. However, the agreement reached in April left many women’s organisations disenchanted.

The EU’s rules for online platforms badly needed updating: the framework the DSA revises had remained largely unchanged since 2000. In the text presented to the European Council, article 24b outlined regulations governing pornographic websites. Requirements included phone and email address verification for users uploading content, and human (rather than AI) moderation of the sites’ content. Victims of image abuse would have been able to ask for content removal without providing personal information. But article 24b was not adopted.

“I am still shocked,” said Alexandra Geese, a German MEP and the author of the 24b amendment. After lengthy negotiations, it all came down to trade-offs between the parliament and the council. “The council didn’t want to accept too many of the parliament's provisions. Basically, [they said] you can get four or five, but not everything. I was the only one defending 24b until the end,” Geese said.

The parliament decided to prioritise measures related to consumer protection or those allowing the European Commission and member states access to the algorithms of very large online platforms. A source linked to the council side of negotiations (who wanted to stay anonymous) said it seemed “difficult to specify further the measures to be taken on this type of issue [revenge porn] in a framework that will apply to all platforms.”

Danish MEP Christel Schaldemose – who led the negotiations on behalf of the European Parliament – didn’t reply to openDemocracy’s request for comment.

Removing content from porn platforms

A key problem for victims of digital image abuse is that it can be very difficult to get content removed from social networks, online forums and messaging apps. Navigating the content removal policies of porn platforms is particularly tricky.

Anna Nackt (a pseudonym – nackt means ‘naked’ in German) knows this only too well. In 2019, this 27-year-old German, based in Berlin, discovered that a dozen photos of herself had been published on various porn sites without her knowledge or consent, together with links to her Facebook profiles.

Trying to remove the photos turned out to be a long and frustrating process. Some websites had no contact address or removal form, so she had to dig around to find the right email address. Some platforms demanded her personal details. “Sometimes you don’t get a notification that they received your message, or that the content was taken down. I just kept refreshing the sites regularly,” she explained.

To help other people in similar situations, she launched the website Anna Nackt. “I wanted to put up some basic information, like what to do if porn platforms don’t reply, or how you can set up an anonymous email address to report the content,” she said.

She added: “It is really frustrating to know that this amendment [to the DSA] wasn’t adopted. This gives us, who are affected, the impression that we are not important.”

Sex workers’ perspective

But not all those affected regret the fact that article 24b wasn’t adopted. “We are for better regulation, but don’t see the point of introducing mandatory phone number and email registration,” said Yigit Aydin from the European Sex Workers’ Rights Alliance. ESWA, which advocates for the recognition and safety of sex work, represents 101 member organisations across 30 countries in Europe and central Asia.

“Porn workers are already asked for more than a phone number. Most platforms ask for a copy of your passport, for age verification. We are against more data collection because sex work is already stigmatised. This data could be leaked and weaponised,” Aydin said.

He described cases where photos of sex workers were leaked without their consent and they were blackmailed for sex. In most cases, they can’t report such abuse to the police because sex work is illegal in their countries.

He also regrets that sex workers were not consulted about the proposed changes to the DSA. “We need to be consulted so that, while trying to get better regulation, we don't harm the most marginalised people in the process,” he said.

Other solutions

The revised DSA does address some of the concerns of the victims of digital image-based abuse. Platforms will be required to remove “without undue delay” the “offending content” reported by users (although anonymous reporting is not allowed).

Other regulations apply to “very large platforms” – those with more than 45 million users in Europe – which probably includes the largest porn sites. They will need to carry out an annual assessment of the risks their activities pose to rights such as human dignity, data protection or diversity of expression, and to analyse the impact of their services on public discourse, mental and physical well-being, and gender-based violence. They will also have to disclose how many staff they employ for content moderation and how those staff are trained.

Josephine Ballon, lawyer and head of legal at Hate Aid, highlights the fact that thanks to the DSA talks, the problem of image-based abuse reached the EU’s agenda. “Since article 24b made it to the council negotiations, it was finally discussed at the high political level. We hope that discussion will continue,” she said.

In March, the European Commission proposed a new directive on combating gender-based violence. The directive aims to criminalise different forms of gendered violence – including non-consensual image sharing – and to “strengthen victims’ access to justice”.

However, Geese points out that it contains no obligations for online platforms, and that criminalisation doesn’t mean it will become any easier to track down perpetrators. “We also don’t know if and when this will end up being discussed at the European Council. It might take a lot of time. It seems that, in general, legislation related to women’s rights doesn’t seem to be a priority,” she said.

Three years after images of her appeared online without her knowledge or consent, Emma Holten decided to publish her own naked photos on the internet. The images – captured by Danish photographer Cecilie Bødker – show Holten topless, reading on a window sill, brushing her teeth or sitting on her bed.

The photo session was part of the project Consent, Holten’s attempt to counter the impact that digital abuse has had on her life. “I [wanted to show that] I wasn’t ashamed of my naked photos being online – the issue is when they are spread without your consent,” she said.

