I was sickened to discover I'd been turned into a deepfake porn victim
Sitting at my laptop, I watched a naked woman with my face having graphic penetrative sex in a variety of positions with a nude man. The pornographic video lasted for three minutes and 32 seconds, and grotesque as I found it, I made myself watch it all. I needed to understand exactly how realistic these images were, and how easy it was for people to access them online.

Because as seamless as the footage appeared, it wasn’t me at all – my face had been superimposed onto another woman’s body using artificial intelligence (AI) to make what is known as ‘deepfake’ pornography.

The video had been unearthed by my Channel 4 colleagues while researching the exponential and alarming rise of deepfake porn for a special report which was broadcast last month.

A deepfake pornography video of Channel 4 broadcaster Cathy Newman was uncovered by her colleagues while researching the rise of the technology

Out of 4,000 celebrities they found featured in deepfake porn videos online, 250 were British – and one of them was me.

None of the celebrities we approached to comment on this would go public. While this was disappointing, I understood – they didn’t want to perpetuate the abuse they’d fallen victim to by drawing more attention to it.

But for our investigation to have maximum impact, I knew that I needed to speak out.

In my 18 years as a Channel 4 journalist I have, sadly, seen plenty of distressing footage of sexual violence. So while I was nervous about becoming part of the story, I thought I would be inured to the contents of the video itself.

But actually it left me disturbed and haunted. I’d been violated by a perpetrator whom, as far as I know, I’ve never met, and I was a victim of a very modern crime that risks having a corrosive effect on generations of women to come.

I also felt vindicated by my decision to go public, because earlier this month the Government announced that the creation of these sexually explicit deepfakes is to be made a criminal offence in England and Wales.

I understand that Laura Farris, the Minister for Victims and Safeguarding, was motivated in part to take action after watching our investigation. This comes after the sharing of this type of content was outlawed in the Online Safety Act last year.

My colleagues were already researching deepfake pornography when, in January, fake explicit images of the singer Taylor Swift went viral on X/Twitter, with one image viewed 47 million times before it was taken down.

Suddenly the alarming scale of the problem became clear. We found the four most popular deepfake porn sites hosting manipulated images and videos of celebrities had had almost 100 million views over just three months, and more deepfake porn videos were created in 2023 than in all the years since 2017 combined.

The videos have been viewed in total more than 4.2 billion times.

You might think some degree of technical expertise is required to make them, but it’s incredibly easy and done mostly using smartphone ‘nudify’ apps – there are more than 200 available. Users submit a picture – one single photograph of someone’s face grabbed from social media is all that’s needed – and this is used to create a horrifyingly realistic explicit image.

Because of the sheer number of celebrity pictures online, we hear about high-profile personalities becoming victims most often. They include American congresswoman Alexandria Ocasio-Cortez, who this month described the trauma of discovering she had been targeted while in a meeting with aides in February, and Italian prime minister Giorgia Meloni, who is seeking damages after deepfake videos of her were uploaded online.

But arguably the greater victims are the hundreds of thousands of women without a public platform to denounce the images as deepfake – the women who might be in a meeting or job interview and not know whether the people opposite them have seen and been taken in by the fake footage.

The recreation of the broadcaster. Out of 4,000 celebrities they found featured in deepfake porn videos online, 250 were British – and one of them was me, Cathy writes

I spoke to one such victim, Sophie Parrish, 31, a florist and mother-of-two from Merseyside. A deepfake porn video of her was uploaded to a website by someone close to her family, and men then photographed themselves masturbating over it. She was physically sick when she found out, and the impact on her since has been profound.

A beautiful woman, she’s lost confidence and now doesn’t want to put on make-up for fear of attracting attention. She almost blames herself, although obviously there’s no blame attached.
