Channel 4 journalist Cathy Newman has described the 'dehumanising' moment she came face to face with an explicit deepfake pornography video of herself.

The 49-year-old, who fronts the channel's evening news bulletins, was investigating videos made with artificial intelligence when she was made aware of a clip that superimposed her face onto the body of an adult film actress engaged in a sex act.

Ms Newman is one of more than 250 British celebrities believed to have been targeted by sick internet users who create the uncannily realistic videos without the consent of their victims - a practice that is set to become a criminal offence in Britain.

Footage of the veteran reporter viewing the video of her computer-generated doppelganger aired in a recent report on Channel 4 News; she says the experience has haunted her, not least because the perpetrator is 'out of reach'.

The investigation into five popular deepfake sites found more than 4,000 famous individuals who had been artificially inserted into adult films without their knowledge to give the impression they were carrying out sex acts.

Channel 4 journalist Cathy Newman says she felt 'utterly dehumanised' after viewing a deepfake pornography clip featuring her face imposed on an adult actress

This is not Cathy Newman - but a deepfake video featuring her face superimposed on that of an adult actress in a pornographic film

Deepfake apps have been advertised on social media despite pledges to crack down on their proliferation

Footage aired as part of the report showed a visibly disturbed Ms Newman watching as her AI-generated double crawled towards the camera.

'This is just someone's fantasy about having sex with me,' she said as she watched the disturbingly realistic clip.

'If you didn't know this wasn't me, you would think it was real.'

What is a deepfake? 

Deepfakes are fabricated images or videos of real people, named after 'deep learning' - the machine learning technique in which an AI model is trained on patterns in data to 'learn' how to recognise and predict things.

Deepfake software trained on images of human faces can layer the facial features from one image - such as a photograph of a person's face - onto another face in an existing piece of video footage.

This can be impressive - as in a clip featuring Marvel actors Robert Downey Jr and Tom Holland swapped into a scene from Back To The Future - but the same technique can easily be used maliciously when faces are superimposed onto adult films.

Some sick apps also offer to remove swimwear from images - meaning anyone who publicly shares an image of themselves on a beach holiday, for instance, could find themselves maliciously rendered artificially nude.

Deepfakes can also take the form of AI-generated images of real people - such as sexually explicit images of Taylor Swift that were circulated on social media in January, and viral pictures of former US president Donald Trump appearing to smile with Black people.

Advancements in AI mean it is harder to spot a deepfake than ever before - but imperfections can still be identified, such as mismatched skin tones, inconsistent blinking and changes in image quality, as well as other AI hallmarks such as poorly rendered details around hands.

Experts advise caution around images that are shared online if they are suspected to be deepfakes - particularly if they have not come from trustworthy and reliable sources.


Writing in a national newspaper, Ms Newman said she had expected to be 'relatively untroubled' by watching a video a twisted stranger had made superimposing her face onto that of an adult performer - but she came away from the experience 'disturbed'.

'The video was a grotesque parody of me. It was undeniably my face but it had been expertly superimposed on someone else's naked body,' she said in The Times.

'Most of the "film" was too explicit to show on television. I wanted to look away but I forced myself to watch every second of it. And the longer I watched, the more disturbed I became. I felt utterly dehumanised.

'Since viewing the video last month I have found my mind repeatedly

