Horror rise of deepfake ‘porn’ photos as Love Island star reveals how she was targeted & victims tell of ‘haunting’ pics
WHEN Love Islander Cally Jane Beech opened a message from a friend, she was shocked to learn they had found a nude picture of her online.
As an influencer, Cally Jane is no stranger to posing in sexy lingerie and bikinis, but she has never once stripped off.
Yet when she put her name into Google last January, her “naked” image was there for all to see.
The stunning star is just one of a string of celebrities — and thousands of ordinary people — whose pictures have been “deepfaked” by AI tech to make them appear naked.
The 33-year-old said: “When I was told there was a nude image on the internet I knew it couldn’t be me because I’ve never had an explicit image of me taken, ever.
“When I saw the image I knew straight away that it had come from a shoot I’d done and been altered by AI to make it look like I was wearing nothing.
“It was on a site that promoted this AI technology and I was being used as an advertisement to divert people to click on porn sites, maybe thinking that they would see me there.
“I was like, ‘Oh my God’. It was shocking and just felt so personal.”
Cally Jane complained to police who helped her get the fake image removed — but they were unable to nail the culprits because the site was hosted outside the UK.
The Government this month announced a crackdown on abusers who make sexually explicit deepfakes and has warned they could be jailed for two years under new laws.
Legislation will mean offenders can be charged for both creating and sharing the pictures.
But campaigners fear it may not be enough to stop the trend and are calling on social media platforms to ban adverts for so-called “nudify” apps.
They are waiting to find out whether the new legal measures will close a loophole that makes it hard to prosecute men if they claim they didn’t mean to cause distress.
One victim even called for culprits to be placed on the sex offenders register as a deterrent to others.
Cally Jane, who will wed SAS: Who Dares Wins contestant DJ O’Neal later this year, said the legislation must be watertight.
The former dental nurse added: “I hope this law gives the police and CPS the powers to prosecute these men. But you still have a problem that many of the sites hosting the images are outside the jurisdiction of British police.
“It’s up to the platforms to sort that now.”
The star was flooded with messages from other victims when she went public on Instagram about her deepfake nightmare last January.
‘Took their own life’
She said: “The key thing that struck me was how many other women had been impacted.
“People messaged to tell me all sorts of things including about a 14-year-old girl who took her own life after a bully created a fake naked image of her and sent it around their school.
“We all share our pictures on holidays, pictures of us in bikinis or with partners etc, and the fact someone can just take these and do what they see fit with them, and there’s little that can be done, is awful.”
Channel 4 News host Cathy Newman was left “haunted” after finding a deepfake pornography film of herself while she was investigating the twisted phenomenon.
The programme found at least 250 British stars have had their faces or bodies superimposed on X-rated images.
Cathy told The Sun last April: “My first reaction was, this is just bizarre.
“Then the longer I watched, I got more and more disturbed because it’s really graphic sexual activity.
“It just feels violating because it looks really realistic. If you didn’t know that wasn’t my body, you’d assume I’d put that video out there.”
Singer Taylor Swift, plus actresses Angelina Jolie and Emma Stone, are among big Hollywood names who have been deepfaked, but thousands of other women are popping up “naked” online too.
Ex-paratrooper Jonathan Bates, 54, was this month jailed for five years after posting deepfake porn profiles of four women.
Bates, who served for 15 years and was awarded the Northern Ireland medal before retraining as a teacher, put doctored images on websites offering sexual services.
One of his victims, who worked as a safeguarding lead at the same primary school in Canterbury, Kent, said she was “left with scars that will never heal”.
Kirsty Pellant, 44, first became aware of the images when a man who had tried to arrange a date through the fake profile turned up at her house in 2017.
She told the Sunday Times: “This type of abuse can have devastating and long-lasting effects on victims.
“I know when I was trying to get some of the websites to listen and help, I couldn’t get anywhere.”
She identified Bates, who was jailed for stalking and revenge porn, after contacting other victims. The women compared notes and found Bates was a mutual connection on Facebook.
In another shocking case, university student “Jodie” was left dumbfounded when she got an anonymous email with a link to a porn website in March 2021.
When she logged on, she found explicit images and a video that appeared to show her having group sex.
‘Sick pervert’
Another picture showed her in a school uniform. Her face had been deepfaked on to another woman’s body.
Horrifyingly, the culprit turned out to be her best male friend, a former BBC Young Composer of the Year winner.
Worse still, her then-pal Alex Woolf, 29, had been a “shoulder to cry on” when Jodie — not her real name — began seeing pictures of herself popping up on dating apps in 2016.
They then started to appear on Twitter and “revenge porn” sites where men post pictures of ex-partners.
Jodie, 27, from Cambridgeshire, said: “The entire time we’d been friends he was my shoulder to cry on.
“Every time my fake picture showed up on a dating site I’d screenshot it and send it to Alex and ask ‘What sick pervert is doing this to me?’ and he’d be full of sympathy. I never sent him any of the more explicit images because by that time I had my suspicions.
“The sender of the anonymous email that tipped me off gave a description of the person behind the images.”
Two months later she was sent a second strange email which directed her to another image that she knew Woolf had been cropped out of.
Jodie said: “We were the only two people who had that picture — that was the smoking gun I needed, it was definitely him. It was the ultimate betrayal.”
Instead of confronting Woolf, Jodie went to the police and in August 2021, he admitted 15 charges of sending grossly offensive messages under the Communications Act.
It was his language and not the deepfake pictures that led to him being given a 20-week prison sentence, suspended for two years.
Campaigners hope the new laws being ushered in will see offenders prosecuted specifically for creating explicit deepfake images.
But, as they await details, they are concerned about a loophole that has allowed offenders to argue they were “joking” when they used AI to “strip” women, which has made prosecution difficult for police.
Professor Clare McGlynn, a legal expert in pornography, said: “The new laws must make it clear that non-consensual sharing is illegal no matter what the motive.
“Previous laws mean the police have had to prove intent to distress the victim and that’s been a challenge and led to few prosecutions.
“This is about making the law comprehensive so it also puts pressure on the social media platforms to remove deepfake content and sites which advertise the technology.”
Rebecca Hitchen, head of policy and campaigns at the End Violence Against Women coalition, echoed the calls for legal reform.
She said: “We await confirmation that any new law criminalising the creation of sexually explicit deepfakes will be based on consent, rather than the perpetrator’s intent.”
The Ministry of Justice told The Sun more details are still to be released about the new legislation.
Jodie and Cally Jane both believe that those who create deepfakes should be put on the sex offenders register.
Jodie said: “They should be lumped in with sex offenders because it’s abuse.”
Cally Jane said: “It would be a huge deterrent if these men were put on the list. It would make people think twice about doing this.
“I don’t think people will really take this seriously until there are some big prosecutions.”
‘STAMP OUT VILE ABUSE’
By Baroness Owen, Anti-deepfake campaigner
DEEPFAKE “porn” is a sickening violation. It’s an epidemic: 99 per cent of victims are women, and it is currently totally legal to create.
It is quite simply abuse, and that’s why last September I introduced a bill to criminalise it and why I’m now tabling amendments to the Data Bill.
The rapid development of generative AI, plus the easy accessibility of the apps and platforms that can create this degrading content, means deepfake “porn” can be created by anyone, anywhere, at any time.
This violating practice has proliferated rapidly since it first emerged around 2017.
Analysis found that the top 40 sites dedicated to this abuse received around 40million hits between them in one month.
While it is now illegal to share non-consensual sexually explicit content, both real and deepfake, it is still not illegal to create it, meaning we have a gaping omission in legislation.
There are vast numbers of online forums dedicated to image-based abuse. They are populated by men who encourage each other to create sexually explicit deepfakes.
I am pleased that finally, after a long push by the victims and campaigners I have worked with, the Government has announced it will legislate this year to criminalise this abhorrent practice.
However, any law omitting “solicitation” would leave a gaping loophole, yet the Government has so far refused to commit to including it.
Shockingly, it has further refused multiple times to commit to the legislation being “consent based”.
A consent-based offence would mean victims would not have to suffer the trauma of having to prove the perpetrator’s motivation – a woman’s lack of consent should be enough.
A clear path to justice is needed so victims do not also have to suffer the distress of sexually explicit content remaining in the hands of their abuser.
Campaigners like myself will not stop pushing for these vital protections.