LOVE Island legend Cally Jane Beech has spoken of her horror after discovering fake nude photos of her circulating online.
The reality TV star took to Instagram to warn fans of the dangers of AI deep fake images after being sent a naked photo that looked like her.
The 32-year-old addressed her followers in a video while sitting in her bedroom, and said: "It's actually quite scary when I think about it.
"I was sent a picture from someone that said, 'this is a nude picture of you', and I said that's impossible because I've never sent a nude picture or posted any nude pictures anywere.
"It was what looked like to be me my face, my body, but an AI version of boobs and private parts. First thing I did was panic."
Cally initially posted a warning to her 500k Instagram followers last night, but deleted the post saying she didn't want to promote the website that had generated the images.
The mum - who recently spoke to The Sun about her Love Island ex Luis Morrison - said she had gone to the police and "people are looking into it", but said she feared there was "no legislation or legal backing" on such images.
She continued: "Obviuosly someone has gone on there and done that picture of me and put it on teh internet.
"I want to raise awareness on this I think it is massively important.
"I knew there was such a thing as AI but I didn't realise this kind of thing was happened.
"I was so shook last night, I wanted to cry, the next minute me and (her partner) DJ were laughing, but I knew it was so serious."
Manipulated videos and pictures can be very hard to detect.
But Cally said she knew immediately that the photos were faked.
She wrote: "When they sent me the picture they was shown I instantly knew it was a deep fake, coz for one I know what my body looks like...
Deepfake nudes - and why they are now a huge problem
DEEPFAKES have become more commonplace as technology continues to evolve, making it harder to differentiate what is reality.
Sharing deepfake intimate images is due to be criminalised in England and Wales as the legal framework rapidly changes to catch up with technology.
But fake nudes posted on seedy websites are going viral. In 2021 it was reported one site garnered 38million hits in its first year.
It is often unclear who is behind the sites.
Deepfakes are made using complex artificial intelligence (AI) technology to manipulate images and videos.
It can make the subject of a photo or clip look like they are saying or doing something that they didn't.
It's one level up from dubbing, or lip syncing, and can appear very convincing.
Last year mums joined forces after girls were blackmailed over images made with artificial intelligence tech.
The victims' ages are believed to be between 11 and 17, it was reported.
She said the image was also missing identifying marks such as a tattoo on her lower abdomen.
Cally continued: "I looked into it and I have found there are sites that allow you to remove peoples clothes/underwear appearing them to be naked."
The star vowed "I will get to the bottom of this" and warned her fans: "Be mindful what you see on the internet or someone may send you is not real and can be very soul destroying or affect someone hugely.
"Please be aware guys. The internet is a scary, scary place."
Not even someone with the power and influence of Taylor Swift is safe from the vile abuse of modern tech.
The American superstar is currently waging war on social media bosses after truly grim AI-created images of her in intimate positions were shared on social media.
One of the deepfake pics was viewed 47million times on X/Twitter before it was removed.
It prompted social platform X to put a block on searching for Taylor’s name as a “temporary action” to prioritise safety.
Deep fake images of real women being undressed have been around for several years now and have also become a problem in schools.
An example of a fake video is one created in 2019 of Boris Johnson.
The clip emerged online to show the then-prime minister and Jeremy Corbyn endorsing each other to lead the UK.
An investigation last year found cyber-crooks are selling deepfake revenge porn on the dark web for up to £15,800 for minute-long clips.
Twitch and OnlyFans star Indiefoxx revealed earlier this year she was blackmailed with AI-generated deepfake nudes.