The government should send a clear message to creators and sites that advertise this abusive content

Most people's fears about AI are focused on the future. But we're not paying nearly enough attention to how these technologies are already dramatically increasing cases of sexual abuse in the present.

Take deepfake pornography, a form of image-based sexual abuse in which digitally altered sexual images of victims are created and shared without people's consent. This is rising dramatically in Britain: Google "deepfake porn" and the top results will not be critical discussions of this abuse, but sites where one can buy and access these abusive images.

I've been writing about sexual abuse for years, and I'm deeply concerned that we're not doing enough to stop this. In recent months, people have shared digitally altered sexual images of the new deputy prime minister, Angela Rayner, and celebrities including Taylor Swift. But you don't need to be famous to appear in one of these images or videos – the technology is readily accessible, and can easily be used by ex-partners or strangers to humiliate and degrade.

As a tech luddite, I was still under the impression that one needed some digital skills to commit this kind of abuse. Not so. You can simply take someone's image, put it into a "nudify" app, and the app's AI will generate a fake nude picture. "It's quick and easy to create these images, even for anyone with absolutely no technical skills," Jake Moore, an adviser at a cybersecurity firm, told me.

The impact of this kind of abuse on victims is traumatic and dangerous: first, there is the covert theft of your image; then, the trauma of it being "nudified"; and then the re-traumatisation that occurs when the image is shared online with other people. Victims of this abuse have reported serious mental health consequences. One woman told this newspaper she experienced repeated nightmares and paranoia after she was the target of deepfake images. Another, Rana Ayyub, who has also spoken publicly about being a target, experienced so much harassment as a result of a deepfake pornography image that she had to approach the United Nations for protection.

So how can we stop it, and why aren't we doing so? The now-toppled Conservative government had planned to introduce a bill to address the alarming proliferation of deepfake pornography by making it a criminal offence, but the bill had serious gaps that would have left victims exposed and given perpetrators too much freedom to continue creating these images. In particular, the bill didn't cover all forms of deepfake pornography – such as images that use emojis to cover genitals – and it required proof of motive, such as that the perpetrator intended to use the image for sexual gratification.

This is a problem on several levels. First, it leaves perpetrators open to arguing that they simply created the images "for a laugh" (I'm thinking of Donald Trump's "locker room talk" comments), or even for "artistic purposes" (God help us). And this brings us to one of the major problems with this type of abuse: in certain circles, it can masquerade as something that is funny or that we should take as "a joke".
This feeds into a certain type of masculine behaviour, on the rise in the wake of the #MeToo movement, that attempts to downplay forms of sexual abuse by accusing women of taking "laddish" behaviour too seriously.

Second, in putting the burden on the prosecution to prove the motive of the perpetrator, the bill set a very high – perhaps impossible – bar for a criminal prosecution. It's very difficult to prove what a perpetrator was thinking or feeling when they created deepfake pornographic images. As a result, police forces may be less willing to charge people for these crimes, meaning there will be fewer consequences for perpetrators.

A recent Labour party initiative looked at addressing some of these issues, so I'll be watching to see whether these gaps are filled in any forthcoming legislation. There are a number of things the party could do to clamp down on these crimes – and other things we could be doing now. We could be pursuing civil remedies for deepfake pornography, for instance, which can be quicker and more effective than going through the criminal justice system. New rules allowing courts to take down images swiftly could also be a huge help to victims.

But there's an even bigger issue that we'll need to tackle: the search engines and social media sites that promote this type of content. Clare McGlynn, a professor at Durham University who studies the legal regulation of pornography and sexual abuse, told me that she had been discussing this problem with a prominent technology company for several months, and the company had still not changed its algorithm to stop these websites appearing at the top of the first page of results.

The same is true of social media sites. Both McGlynn and Moore say they have seen deepfake websites advertised on Instagram, TikTok and X. This is not just a dark web problem, where illegal or harmful content is hidden away in the sketchiest reaches of the internet. Deepfake pornography is being sold openly on social media.

In theory, this should make the problem easier to tackle, because social media sites could simply ban these kinds of adverts. But I don't have much faith: as a female journalist, I've had plenty of abuse on social media, and I have never received a response when I've complained about it to the social media companies.

This is where our regulators should step in. Ofcom could start punishing search engines and social media sites for allowing deepfake adverts. And if the government made deepfakes a criminal offence, the regulator would be forced to act.

Our new prime minister has already made it clear that his government is all about change. Let's hope that protecting the victims of sexual abuse and stemming the tide of deepfake pornography is part of it.

Lucia Osborne-Crowley is a journalist and author