Epidemic of deepfake ‘porn’ is a sickening violation… campaigners like myself won’t stop pushing for vital protections
DEEPFAKE “porn” is a sickening violation.
It is an epidemic: 99 per cent of victims are women, and it is currently totally legal to create.
It is quite simply abuse. That is why last September I introduced a bill to criminalise it, and why I am now tabling amendments to the data bill.
The rapid development of generative AI, plus the ease of access to the apps and platforms that can create this degrading content, means deepfake “porn” can be created by anyone, anywhere, at any time.
This violating practice has proliferated rapidly since it first emerged around 2017.
Analysis found that the top 40 sites dedicated to this abuse received around 40million hits between them in one month.
While it is now illegal to share non-consensual sexually explicit content, both real and deepfake, it is still not illegal to create it, meaning we have a gaping omission in legislation.
There are vast numbers of online forums dedicated to image-based abuse.
They are populated by men who encourage each other to create sexually explicit deepfakes.
I am pleased that finally, after a long push by the victims and campaigners I have worked with, the Government announced it will be legislating this year to criminalise this abhorrent practice.
However, any law omitting “solicitation” would leave a gaping loophole, yet the Government has so far refused to commit to closing it.
Shockingly, it has further refused multiple times to commit to the legislation being “consent based”.
A consent-based offence would mean victims would not have to suffer the trauma of having to prove the perpetrator’s motivation – a woman’s lack of consent should be enough.
A clear path to justice is needed so victims do not also have to suffer the distress of sexually explicit content remaining in the hands of their abuser.
Campaigners like myself will not stop pushing for these vital protections.