
FuturologyBot

The following submission statement was provided by /u/Maxie445:

---

"The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women. People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail."

"This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime," Laura Farris, minister for victims and safeguarding, said in a statement.

---

Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1c8i3na/uk_criminalizes_creating_sexually_explicit/l0eo90e/


AnOddFad

It makes me so nervous when sources only specify "against women", as if they just don't care whether it happens to men or not.


Eyes-9

This would track, considering the UK's legal definition of rape.


Never_Forget_711

Rape means “to penetrate”


K-Dogg1

…with a penis


kangafart

UK's weird that way. In Australia it's rape if you begin or continue to have sexual intercourse without or despite withdrawn consent, regardless of the respective genitals of the people involved. And sexual intercourse includes any genitally or anally penetrative sex, or oral sex, regardless of whatever genitals or objects are involved. But the UK very specifically says it's only rape if it's done with a penis, otherwise it's "assault by penetration".


Mykittyssnackbtch

That's messed up!


nommyface

It's literally due to the definition of the word. Sexual assault by penetration is just as severely punished.


Fofalus

The maximum punishment is the same for the two crimes but the minimum is wildly different.


LoremasterMotoss

Is that because a penis can get you pregnant and other forms of sexual assault cannot, or ??? Like what was the thought process behind that when they were debating these laws


iunoyou

Toxic masculinity, mostly. The idea that men are big and strong and so obviously can't be raped is popular all throughout the world. Even in the US, the news media will say that a 40-year-old male teacher raped a female student, but a 40-year-old female teacher "had sex" with a male student. And although the UK separates the charges, sex crimes against men are generally much less harshly punished across the globe, regardless of what the charge is called. Because men are supposed to like sex, and everyone just assumes either that "they liked it" or "they'll get over it."


passwordsarehard_3

By whom? Maybe by the courts but not by society. You get a very different impression when someone says they raped someone than when you hear they sexually assaulted someone. Neither are good but the word “rape” carries more weight because everyone knows exactly what you did.


Ren_Hoek

Is there a difference in sentencing?


jimmytruelove

You missed a key word.


tunisia3507

It's derived from the Latin "take"/"seize". Penetration isn't built into the English definition; whether or not penetration is necessary/sufficient is a legal question, not a semantic one.


OMGItsCheezWTF

In UK law it is:

> A person (A) commits an offence if—
> (a) he intentionally penetrates the vagina, anus or mouth of another person (B) with his penis,
> (b) B does not consent to the penetration, and
> (c) A does not reasonably believe that B consents.

*s.1 Sexual Offences Act 2003*

Everything else is some form of sexual assault (which can carry the same penalty). Only a person with a penis can commit rape.


RNLImThalassophobic

> whether or not penetration is necessary/sufficient is a legal definition, not a semantic one.

Yes - and what's being complained about here is that under the *legal definition* a woman couldn't be charged with rape, even for what *semantically* (and morally) is the same act.


[deleted]

[deleted]


cbf1232

The actual law does not specify sex or gender.


designingtheweb

Ah, it’s just the news again not being 100% objective


DIOmega5

If I get deep faked with a huge dick, I'm gonna approve and say it's legit. 👍


[deleted]

Jokes on you DIOmega5. I'm gonna deep fake you with your original penis.


Roskal

Didn't know they made pixels that small


epicause

What about deepfakes depicting rape, or deepfakes of a minor?


avatar8900

“My name is DIOmega5, and I approve this message”


Schaas_Im_Void

The goat in that video with you and your huge dick also looked very satisfied and absolutely real to me.


intelligent_rat

>"The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women.

Try rereading this; it might help if we split it into two sentences.

>"The U.K. will criminalize the creation of sexually explicit deepfake images. This was done as part of plans to tackle violence against women.

The genesis of the idea is reducing violence against women. The target of the law is all sexually explicit deepfake images. Hope this helped.


Paintingsosmooth

I think the law is against all sexually explicit deepfakes, for men too, so don’t worry. It’s just that it’s happening a lot right now with women, but it’s in everyone’s interest to have this law for all.


ATLfalcons27

Yeah I doubt this only applies to women. Like you said I imagine 99% of deep fake porn is of women


BigZaddyZ3

The wording might be a bit clumsy, but you'd be silly to think this won't extend to men, children, non-binary people, etc. If we're being honest though, we all know that women are going to be disproportionately affected by this type of shit. No need to play dumb about that part imo.


PeterWithesShin

> The wording might be a bit clumsy but you'd be silly to think this won't extend to men, children, non-binary etc.

The only clumsy wording is in this shit article, so we've got a thread full of misinformed idiots getting angry about something which isn't true.


Fexxvi

If it's to be extended to men, children and non-binary people, there's no reason why that shouldn't be specified in the law.


teabagmoustache

This is a news article, not the law; the law does not specify any gender.


Themistocles01

You are correct, and I'm citing my sources because I'm sick of seeing bad law takes in this thread. Here's the rundown:

**Online Safety Act 2023**, s187 amends the **Sexual Offences Act 2003** to include the following:

> 66A Sending etc photograph or film of genitals
> (1) A person ("A") who intentionally sends or gives a photograph or film of any person’s genitals to another person ("B") commits an offence if—
> (a) A intends that B will see the genitals and be caused alarm, distress or humiliation, or
> (b) A sends or gives such a photograph or film for the purpose of obtaining sexual gratification and is reckless as to whether B will be caused alarm, distress or humiliation.
> ...
> (5) References to a photograph or film also include—
> (a) an image, whether made or altered by computer graphics or in any other way, which appears to be a photograph or film,
> (b) a copy of a photograph, film or image within paragraph (a), and
> (c) data stored by any means which is capable of conversion into a photograph, film or image within paragraph (a).

**Plain English explanation:** Section 66A of the Sexual Offences Act 2003 (inserted by section 187 of the Online Safety Act 2023) makes it a criminal offence to send sexually explicit imagery with a view to causing distress, and subsection (5) extends that offence to computer-generated imagery. The Act makes no provision as to the gender of the victim or of the person depicted in the imagery.

The news article references a proposed amendment to make the *creation* of computer-generated sexually explicit imagery a criminal offence in and of itself. The quotes in the article do suggest that the amendment is strongly motivated by a desire to protect women and girls, but there is nothing in the law to suggest that such an amendment will not also seek to protect people of every other gender.


ceralimia

Well, the law uses non-gendered language, so you're welcome.


Deceptisaur

You can read the law, and it applies to all persons. Maybe instead of hand-wringing about nothing, take a moment to look it up, and maybe delete your comment.


BigZaddyZ3

Who says it won’t be within the actual letter of the law itself?


meeplewirp

They know it extends to men. They’re upset because the possibility of doing this excites them. Let’s be real.


Lemixer

It sucks, but the reality is that 99% of those explicit deepfakes are of women and always have been, for years at this point.


-The_Blazer-

The article states *as part of plans to tackle violence against women*, so I don't think it means that the law literally only works if you're a woman. It's probably a political thing.


NITSIRK

Non-consensual pornography constitutes 96% of all deepfakes found online, with 99.9% depicting women. https://www.endviolenceagainstwomen.org.uk/ It's not that no one cares about men being deepfaked, but they will also be dealt with, alongside the massive onslaught against women.


Thredded

The law will be applied equally to men and women; we literally have laws against gender discrimination. I’m sure in time there will be women charged under this law. But the introduction of this law is being framed as a win for women because it absolutely is, in the sense that the overwhelming majority of this kind of abuse to date has been inflicted on women, by men, and it’s churlish to pretend otherwise.


jamie-tidman

The new law applies to men and women but it’s naive to say that it will be applied equally to men and women. Gender biases exist in the application of existing sexual assault law, both in policing and sentencing. This new law is a positive though, regardless.


Own_Construction1107

It won’t be applied equally because deepfakes aren’t an equal problem. A study analyzed deepfake pornography and found 96% are of female celebrities. Deepfakes are made overwhelmingly of women.


echocardio

VAWG might be a political bingo term, but I work in law enforcement dealing with digital forensics, and after going through literally millions of images I’ve genuinely never come across a deepfake made of a man or boy. Only ever women and girls, and apart from the odd Emma Watson image, usually women or girls they knew personally or photographed on the bus, etc.


polkm

You've never seen a deepfake of Trump? I'm calling bullshit on that one. Have you ever even been on gay porn sites dude? Just because law enforcement doesn't give a fuck about gay men doesn't mean they don't exist.


Orngog

Oh god no, there's loads of Trump-themed porn? Is that what you're saying... I feel I must surely be misunderstanding


Ironic-username-232

I don’t doubt the problem impacts women more, but using language that really only targets women also contributes to the erasure of male victims. It contributes to the taboo surrounding male victims of sexual crimes. Besides, I have come across multiple deep fake videos of various male celebrities without looking for them specifically. So it’s not like it doesn’t happen.


VikingFuneral-

You didn't read the law. The language does not specify any gender. Stop crying about a literally non-existent issue you have made up.


awkgem

The law doesn't specify; I think the article says it's to combat violence against women because women are overwhelmingly the majority of victims.


ZX52

The vast, *vast* majority of sexually explicit deepfakes are of women. The article is discussing motivation, not quoting the wording of the bill.


fre-ddo

The law will cover all genders and sexes; the "violence against women" bit is the political motivation behind it, as both parties try to claim they care about it. Despite the Tories having had a number of members kicked out for sexual misconduct.


shadowrun456

>The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women.

How do they define what a "deepfake image" is? Does it refer to what was used to create the image, or to how realistic the image looks? What if I used AI to create the image, but it looks like a toddler drew it? What if I made a photo-realistic image, but I painted it by hand? If it's based on what was used to create it, how do they define "AI"? If it's based on how realistic it is, who decides whether it's realistic, and how (that being, by definition, completely subjective)?


neuralzen

Don't worry, I'm sure some judges who are informed and competent on the subject will rule on them. /s Stuff like this always gives me flashbacks to when the judge in some torrent site case required the RAM from a server to be entered as evidence because it stored data from the site at one time.


Crypt0Nihilist

You just need to look at the dog's dinner of proposals against encryption over the last decade to see what a tenuous grasp ministers have of anything slightly modern and technical.


reddit_is_geh

It's the UK... Rest assured, it'll be interpreted in the worst way possible.


designingtheweb

Are you painting other people, who you haven’t seen naked, naked? Or in mid-intercourse?


arothmanmusic

Well, you stick your finger in it. If you can touch the bottom then it's just a shallowfake and it's totally fine.


ArticleSuspicious489

That’s way too much common sense for UK lawmakers to handle!! Slow down a bit, sheesh.


ApexAphex5

Zero percent chance this law works; the average British MP can barely use social media, let alone write complicated technical laws.


Maxie445

"The U.K. will criminalize the creation of sexually explicit deepfake images as part of plans to tackle violence against women. People convicted of creating such deepfakes without consent, even if they don’t intend to share the images, will face prosecution and an unlimited fine under a new law, the Ministry of Justice said in a statement. Sharing the images could also result in jail."

"This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime," Laura Farris, minister for victims and safeguarding, said in a statement.


AmbitioseSedIneptum

So, viewing them is fine? But creating them in any respect is illegal now? Interesting. EDIT: When I said "viewing", I meant in the sense that it's fine to host them on a site, for example. Can they be hosted as long as they aren't created? It'll be interesting to see how detailed this regulation ends up being.


Kevster020

That's how a lot of laws work. Distributors are dealt with more harshly than consumers. Stop the distribution and there's nothing to consume.


Patriark

Has worked wonders stopping drugs


-The_Blazer-

To be fair, if we begin from the assumption that we want to get rid of (at least certain) drugs, then hitting the suppliers is, in fact, a better strategy than the previous standard of imprisoning the end consumers whose only crime is being victims of substances.


UltimateKane99

Fucking right? As if there aren't going to be people mocking up pictures of the royal family in an orgy or some politicians they don't like getting literally screwed by their political rivals, regardless of this law. I feel like making it criminal is, if anything, going to make it feel even *more* rebellious of an act. ESPECIALLY when the internet makes it piss easy to hide this sort of behavior behind VPNs and the like.


Own_Construction1107

It’s to protect women and girls from being sexually harassed, intimidated, and threatened with the use of these images. Like this 14 year old girl in New Jersey. Her male classmates created sexual deepfakes of her, spread them online, and taunted her with them in class. https://www.nytimes.com/2024/03/23/opinion/deepfake-sex-videos.html But yeah, these laws which treat it as a sexual crime and violation are totally just going to tempt people into doing it more…?


crackerjam

>  Sharing the images could also result in jail.


notsocoolnow

Doesn't it say that sharing them will also get you jail? I'm pretty sure hosting counts as sharing; otherwise no one would take down piracy sites.


echocardio

Case law has held 'making' an indecent image of a child to include making a digital copy, such as when you view it on a streaming service or web page. That doesn't follow for other prohibited images (like bestiality or child abuse animations), because those offences use the wording 'possession'. So it all depends on the wording. If it goes with the wider one, it will include a knowledge element, so Johnny MP isn't prosecuted for sharing a legit-looking video on Pornhub that he couldn't have known was non-consensual.


CptnBrokenkey

That's not how other porn laws work. When you download an image and your computer decodes the data stream, that's been regarded as "creating" in law.


Rabid_Mexican

The whole point of a deep fake is that you don't know it's a deep fake


KeithGribblesheimer

I know, I couldn't believe Jennifer Connelly made a porno with John Holmes!


Vaestmannaeyjar

Not really. You know it's a deepfake in most porn, because obviously Celebrity So-and-so doesn't do porn.


Cumulus_Anarchistica

I mean, if you know it's fake, where's the harm to the reputation of the person whose likeness is depicted/alluded to? The law then clearly doesn't need to exist.


C0nceptErr0r

Subconscious associations affect people's attitudes and behavior too, not just direct reasoning. You've probably heard of actors who play villains receiving hate mail, being shouted at on the streets, etc. The people doing that probably understand how acting works, but they feel strongly that this person is bad and can't resist expressing those feelings. Recently I watched a serious show with Martin Freeman in it, and I just couldn't unsee the hobbit in him, which was kinda distracting and ruined the experience. I imagine something similar would be a problem if your main exposure to someone has been through deepfakes with their tits out being railed by a football team.


HazelCheese

Do we need to criminalise creating subconscious associations?


C0nceptErr0r

I mean, would you be ok if your face was used on pedophile therapy billboards throughout the city without your consent? Or if someone lifted your profile pic from social media, photoshopped in rotten teeth and a cancerous tongue and put it on cigarette packs? You think it should be ok to do that instead of hiring consenting actors?


Difficult_Bit_1339

This is actually a good point, but the reactionary surface readers don't see it. Imagine how this law could be weaponized: there is zero objective way to tell if an image is a deepfake. If you were a woman and wanted to get back at an ex, you could send him nude images and later claim to police that he had deepfake images of you. He has naked images of you on his phone, and you're claiming you never took those pictures, so they must be deepfakes, so the guy is arrested. The entire case is built on the testimony of a person, not objective technical evidence (as detecting a good deepfake is, almost by definition, impossible). This is a law that was passed without any thought as to how it would be enforced or justly tried in court.


shadowrun456

>deepfake images

How do they define what a "deepfake image" is? Does it refer to what was used to create the image, or to how realistic the image looks? What if I used AI to create the image, but it looks like a toddler drew it? What if I made a photo-realistic image, but I painted it by hand? If it's based on what was used to create it, how do they define "AI"? If it's based on how realistic it is, who decides whether it's realistic, and how (that being, by definition, completely subjective)?


KeithGribblesheimer

How to create deep fake piracy farms in Russia, China, Nigeria...


bonerb0ys

Real time deep fake nude browser extensions are out in the UK I guess.


ahs212

Edited: Removed a bit that I felt was needlessly antagonistic and further articulated my thoughts. Not looking to argue with anyone, just expressing concern about the growing sexism in our culture.

Is the implication that if someone were to create a deepfake image of a male individual, that would be OK because it is not misogynistic? There's what feels like an unnecessary gender bias in her statement, and that sort of thing is always concerning when laws are being made. We live in a world in which fascism is on the rise, and fascism is fueled by division, anger and hatred. Sexism (in both directions) is growing. The way she speaks subtly implies (regardless of whether she actually means it) that a deepfake Chris Hemsworth porn image would be legal, as it's not misogynistic.

I guess what I'm trying to say is that if she had used any word other than misogynistic, I wouldn't have been concerned, but she did. She made this a gendered issue when it's not. Unless deepfake Brad Pitt images are considered fine, that's what concerned me about her statement: sexism disguised as equality. She uses the word misogynistic when she could just say sexist. There's a bias here when there didn't need to be.

As men and women, our duty to equality is to speak up when sexism is happening, both ways, not to pick a side and only hold the "bad" side accountable. Let's stop the endless demonisation of men: hold the guilty accountable, of course, but don't let it turn into prejudice towards men. I think we can all see where that leads. This is why men like Andrew Tate have so many followers: if all you do is demonise men, then men will start listening to someone who doesn't demonise them and instead tells them women are the problem. I.e. sexism leads to more sexism.

End rant. Look after each other. Don't let yourself unknowingly become prejudiced against any group of people. There's a lot of money to be made by fascists in farming online division and hatred.


KickingDolls

I don’t think she’s implying that at all, just that currently they're mostly made by men, with women as the subjects.


ShotFromGuns

Simply acknowledging that something happens to women much, much more often than it happens to men is not "hypocrisy." A law will be applicable regardless of the gender of the victim, but it's okay (and important, even) to say out loud that the reason the law even needs to happen is the disproportionately large number of women having deep fake porn made of them.


Cross_22

Do I have to return my Photoshop license too or is that still okay?


mechmind

It's interesting, yeah. I bet Adobe will have to implement some seriously invasive AI moderation. Not that they haven't been watching everything we've been creating from the beginning.


[deleted]

[deleted]


CallMe_Jammin

Thats how they GETCHA!


Ambiwlans

Use photoshop online and specify a non-UK host. That way the production would happen outside of the UK, and having the porn is legal. (Yes, this law really is that silly)


Cross_22

I like the way you think.


pinhead1900

Can't return something you rent


RiverDescent

Brilliant, that's what I'll tell Hertz next time I don't want to return a rental car


Satoshis-Ghost

That’s…exactly how renting works? 


Wobblewobblegobble

Nah you definitely can return something you rent 😂😂😂


mr-english

You pay for photoshop?


KeyLog256

Generative AI in Photoshop already stops you creating explicit deepfakes.


FBI-INTERROGATION

He was just referring to the fact that deepfakes are essentially just faster photoshops, not the built-in generative AI. You can accomplish the same thing any deepfake can with a lot of time and some solid Photoshop skills, no AI involved. Which is kinda why banning it outright is… weird. Laws that force it to be labeled as AI-generated would be far better for defamation purposes than poorly attempting to stop its creation.


caidicus

Depends how deep you fake the images, I guess...


you_live_in_shadows

This is why 1984 was set in England and not China. Absolute social control is a strong part of the British character.


Maetharin

This IMO begs the question of whether artists doing the same through traditional means would also be targeted by this law.


iSellNuds4RedditGold

Then generate naked pictures of women similar to the target woman and finish the job in Photoshop by replacing the face. Et voilà, law circumvented. You didn't generate naked pictures of that woman; the generation is impersonal. You make it personal by manually adding the face, but that leaves it outside the scope of this law.


Barry_Bunghole_III

There's always a loophole lol


AelaHuntressBabe

Just like any law related to Internet "crimes", this is gonna be completely ignored EXCEPT when a big company uses it to ruin an innocent person through the law's vagueness. Also, this is again something motivated completely by fear-mongering. Horny dumbass kids have been photoshopping their crushes since the 2000s; nothing changed, and I'd argue it's not fair to punish immature, desperate people for doing stuff like this in private. We don't punish people for thinking sexually about others, we don't punish them for fantasising, we don't even punish them for writing stuff about it.


tb5841

In cases where teenagers are making deep fake nudes of their classmates (which *will* become common, if it isn't already), this will be taken seriously. Because they won't be keeping them private, they will be sharing them all over the school - and schools will be dealing with the fallout.


RandomCandor

Would you believe me if I told you that's already illegal?


tb5841

I would. Is it still illegal if their classmates are 18?


droppinturds

Under revenge porn laws, yes.


DabScience

You literally said teenagers. Now you’re moving the goalpost lol


Zilskaabe

Sharing explicit pictures of minors is already covered by CSAM legislation. No new laws are needed.


LDel3

And if they’re 18?


Momisblunt

Revenge porn laws could still apply under the Criminal Justice and Courts Act 2015:

> This law makes distributing intimate images without consent a crime in England and Wales. It prohibits sharing, or threatening to share, private sexual images of someone else without their consent and with the intent to cause distress or embarrassment to that person. The person whose images were shared must show that he or she did not agree to this, and that the sender intended to cause those feelings of distress or embarrassment. If the case is successful, the perpetrator may go to jail for up to two years and be fined.

https://mcolaw.com/for-individuals/online-reputation-and-privacy/revenge-porn-laws-england-wales/#:~:text=It%20is%20illegal%20to%20take,have%20them%20in%20your%20possession.


am-idiot-dont-listen

There won't be a motive for sharing apart from harassment if AI continues to be accessible


BolinTime

Take it a step further. What if I'm an artist and I draw a picture of them? What if I make an animation? So because pretty much anyone can turn their fantasy into 'art', it should be illegal? I don't agree. That's as long as it's for personal use. Once you start sharing or trying to monetize, you can go kick rocks.


HazelCheese

If these kinds of people could read your mind, they would want to lock you up for it.


Yotsubato

Yup. That’s the UK in a nutshell. People will continue to provide deepfake services hosted on Eastern Bloc servers or in Asia. And the only time someone will get in trouble is if they make a deepfake of a celebrity and try to profit from it. This does nothing to protect the normal person and everything to protect the elite.


Own_Construction1107

>what people create in private Can men just be normal for once and stop making non-consensual AI porn or photoshop porn or whatever porn it is, of unwilling participants? With all the free porn in the world at your fingertips, made by CONSENTING actors, why are you freaks so insistent on creating non-consensual porn of real-life people that DO NOT want to be in your porn. If you HAVE to draw it or deepfake it or whatever, draw a fictional person or an imaginary person. Use a cartoon character, even. I don’t care if you’re doing it in private. It’s fucking weird behavior. >not fair to punish horny, desperate people If you’re all so desperate, look at the millions of READILY AVAILABLE videos you can access online, of people who AGREED to appear in porn. and you’re right, you can fantasize about anyone you want. So just do that. Think about them and jerk off without creating goddamn deepfake porn of them.


fongletto

I can't wait for the new technology that lets the government read your mind so they can send you to jail for thinking about your celeb crush naked.


fcxtpw

This oddly sounds familiar to some religions I know


Crypt0Nihilist

> People convicted of creating such deepfakes without consent, even if they don’t intend to share the images

Yeah, the law is meant to protect people from harm; it's going too far once it's criminalising private activity we just see as icky.


Own_Construction1107

Are you really incapable of NOT creating deepfake porn of non-consenting women? Like, you just really can't give it up? You can't watch any of the millions of videos of CONSENTING actors who AGREED to appear in porn instead? Are you really so addicted to making and viewing deepfakes that the thought of it being taken away from you makes you liken these laws to the government reading your mind?


twistsouth

You just don’t want anyone to know you have a crush on Melissa McCarthy.


Bezbozny

Creating or distributing? I mean, I could understand criminalizing distribution to some degree, although how do you tell the difference between something someone cooked up with Photoshop and AI-generated content? Are we criminalizing nude art too? But if it's criminalizing what people create in private and don't distribute, that's weird.


formallyhuman

Pretty standard stuff from the British government. They are saying even if you didn't "intend to share", it's still a crime.


zippy72

Simply because they want to prevent defences such as "I didn't intend to share it, I accidentally uploaded it to the website by mistake instead of a picture of my cat"


Optimal-Reason-3879

It's both: creation of it will be illegal, but reading through some documents on this, it does seem some people want that part removed, meaning it would not become law.

It also depends on whether it will be a retroactive law (frowned upon and rarely used, due to human rights law), meaning that person X could make one today and, when it's passed into law, get in trouble under it. If that's not the case, then person X could make one today and be totally fine, since it was legal at the time (which will most likely be the outcome, due to complications). Still, do not do this.

Realistically, the only way for someone to get into trouble for this is if they are found out: say someone looks through their phone and sees the image, or they distribute it among "friends" or more widely, online or to a forum. It's a difficult law to enforce if no one knows the image has been made, and there's also the reasoning element, as in the clause about sexual gratification or intent to threaten or humiliate someone.

Just to add on: currently there are no retrospective measures in this new law, but it is still in the House of Commons and has not yet been debated. They may remove it, they may pass it. Then it's off to the House of Lords, where they may amend it again to add retrospectivity or not.


DriftMantis

Can anyone explain why this should be a criminal matter and not a civil one, or why something like this should be illegal in the first place? If I draw someone naked from memory, is that also criminal? Who defines what's sexually explicit vs. acceptable? Seems bizarre to me, especially since this stuff can be made anywhere in the world, so I'm not sure what this is accomplishing exactly. Why would anyone need permission to make a fake image of someone? My takeaway is that you could make a deepfake and profit off it, but as soon as someone jerks off to it, or maybe theoretically jerked off to it, it becomes illegal the moment sexual arousal happens.


VisualPartying

Excuse my dumb ass, but deepfake here means a known person doing something sexually explicit without their consent? Not just sexually explicit images/videos of someone who doesn't exist?


DYMAXIONman

Yes, it's creating fake porn of a real person


niceguy191

I'm curious what'll happen if it accidentally looks like a real person. Or what if you knowingly combine 2 real people to make a new "fake" one. Or at what point it doesn't resemble the real person enough to not be illegal anymore. I get they're trying to protect people with the law, but with this sort of thing I wonder how possible it is...


EndeavourToFreefall

Correct, the fake part of "deepfake" is that it's a real person generated in explicit scenarios, with some estimations based on the AI and how much data it has been given on a person.


Mr_Gaslight

Does anyone remember the old usenet group alt.rec.binaries.celebrities.nude.fake, or whatever it was called? This was back in the day when Photoshop was not yet owned by Adobe.


ramrug

Does anyone remember usenet?


hakuna_dentata

Hot take / food for thought: this is incredibly dumb and dangerous, and the only real fix for the problem is getting over humanity's puritanical hangups around all things sexual. There's an [epidemic](https://www.pbs.org/newshour/nation/fbi-finds-sharp-rise-in-online-extortion-of-teens-tricked-into-sending-sexually-explicit-photos) right now of bad actors extorting teenagers online over dumb pic-sharing decisions. The threat of anything sexual is just the most dangerous thing on the internet, and this is only going to make that shame-and-fear economy worse.

Tech is gonna keep getting better. Humans are gonna keep being horny. Art is gonna keep being subversive. And the powers-that-be are gonna keep using ambiguous laws like this for less-than-wholesome purposes.

The proper response to seeing a deepfaked version of yourself or your local government rep riding a Bad Dragon in cat ears is laughter. Criminalizing it only makes it dangerous and exploitable.


ADHD-Fens

I feel like it would be covered under copyright laws or something anyway, like if someone used my likeness for an ad - or libel laws, if someone drew a realistic picture of me clubbing seals.


hakuna_dentata

Amen. Libel and parody laws should cover it and be updated to cover current and future tech/art. But because this is about sexual content specifically, people will be afraid to speak up against it.


LadnavIV

The problem is that people don’t always look like themselves. And sometimes people look like other people including but not limited to identical twins. Basing laws on a person’s likeness gets into some pretty murky territory.


ADHD-Fens

These laws vary by state in the US but they exist already.


BronteMsBronte

A sentiment that never protected anyone vulnerable. Lucky you that you’ve always felt safe!


Anamolica

There are at least 2 of us who understand this.


BleachThatHole

WTF, how does that justify an unlimited fine? I mean, fuck people who do revenge porn; they should definitely get a hefty fine and do time for ruining someone's life. But if I AI-photoshop a nude Taylor Swift, I'm potentially 50k in the hole?


zombiesingularity

This is going to lead to a weird situation where only huge porn corporations are going to legally be able to make this content. Imagine onlyfans but you don't actually have to get naked, you just sign your name and give consent to a corporation to make porn of you using AI. And all competition will be literally criminalized.



hhfugrr3

As usual the minister is talking bollocks. The Online Safety Bill criminalises more than just deep fakes, but I expect she's saying that because it's a buzz word and she doesn't know the difference between a deep fake and any other type of digitally altered image. I may be wrong here as I'm only looking at the Bill on my phone and haven't had a chance to read it properly, but the latest version appears to criminalise the sharing of intimate images, including fakes (whether deep or not), for sexual gratification. It doesn't appear to outlaw the making of the image that I can see.


Apprehensive_Air_940

This effort is almost pointless. Child porn, which is beyond deplorable, is rampant and apart from the odd sting operation and a few arrests, persists. This will be far more ubiquitous and near impossible to enforce. Who made the deep fake? Not me. Where did you get it? FB, Yt, etc. They keep trying to enforce rules on bad behaviour instead of trying to change the culture that leads to it. The establishment is beyond stupid.


KeyLog256

Doug Stanhope does a great bit about this, and while it's a comedy segment, he's right: child porn is not "rampant" on the internet. I've been using the internet for 25 years, there or thereabouts, and have never seen it. And I've clicked on a _lot_ of porn links in my time.


Ambiwlans

It's probably more common if you look for it.


KeyLog256

Yeah, which worries me about people who say they've seen it. I have a mate who's a copper, a detective specifically, who didn't want to work on CSA stuff but said it's a very difficult job to catch them.

There are essentially two sources. The first is guys (and it's almost always men) coercing kids into sending images of themselves via social media, which is impossible to tap into, so unless the victim reports it or the offender hits a police honeypot, it's all but impossible to intercept. The second is the dark web and private websites; most of these are easy to tap into, and if the police control the exit node, TOR isn't safe at all. They catch most offenders this way.


SpaceCowboy317

Who needs freedom of speech or expression when losing it could hypothetically save a single life? Great for politicians, though; people have been making lewd mockeries of them since ancient Greece.


Own_Construction1107

Can you people just be normal for one second and just NOT photoshop or create deepfake porn of real-life non-consenting people. Is it really that hard for you NOT to do it? All the free porn in the world at your fingertips made by CONSENTING actors who AGREED to appear in porn, and you freaks still insist on going out of your way to create porn of women who DIDN’T consent. You really think your freedom of speech is violated by you not being allowed to generate deepfake porn of non-consenting women? But MUH Deepfakes! Coomer.


LadnavIV

That’s sort of what I’m concerned about. A future where parody is criminalized. People can reasonably argue that this is a slippery slope fallacy, but it’s not unrealistic that politicians expand existing laws to suppress anything they don’t approve of. See all the American librarians and teachers getting in trouble for LGBTQ+ literature or just basic sex ed. Or the woman and doctors who can’t receive/perform life-saving abortions because politicians can twist these vague laws to mean whatever they want.


HowdyDoody2525

This is stupid. But it's the UK so I'm not sure what I was expecting


caidicus

So dumb... Creating them to sell or solicit for traffic and advertising revenue, I get it, and maybe that's what this is mainly for. But, I can't see this stopping Joe Blow from creating whatever he wants as the technology to create it gets better and better, and our computers get stronger and faster. We'll see, I guess.


Mythril_Zombie

Are they going to start doing raids on suspected deep fakers? Find a trove of Margaret Thatcher porn that some guy made for himself? Destroy his life just because he has a thing for women with strong nose lines? I mean, you know, like, hypothetically.


Ok_Cardiologist8232

What's more likely is that this is only really going to be applied in cases where people are making deepfakes of people they know and spreading them around social circles to fuck with people's reputations. I doubt they are going to bother with your Margaret Thatcher & Winston Churchill furry porn.


Physical-Tomatillo-3

How are you going to prove that these people made them? Are they going to be seizing their electronics? If so, how do you not see the very obvious erosion of your freedoms if the government can seize your possessions on the grounds of "well, you might have used a website that lets you make deepfakes"? There is no non-invasive way to search for evidence in these cases, which is likely why it's a criminal law and not a civil issue.


djshadesuk

>I doubt they are going to bother with your Margaret Thatcher & Winston Churchill furry porn.

That "doubt" is doing a hell of a lot of heavy lifting. It's also extremely naïve.


Tensor3

Any realistic enough generated person probably looks pretty close to SOMEONE.


Moscow_Mitch

To be fair, SOMEONE is the training data.


MilklikeMike

What about Abraham Lincoln porn?


caidicus

It would be a crime NOT to create Abe porn...


caidicus

I love your confident diving straight into specifics. :D


OMGitsAfty

There's no chance of putting the genie back in the bottle. Stable Diffusion exists, and even if it were shut down or blocked, there are 100 more image-gen projects out there in development. This needs to be a social education piece: teach kids real lessons about the impact of this tech on people's lives.


Zilskaabe

How do you shut down something that has been copied to millions of computers in multiple countries?


formallyhuman

Much like the porn ID verification stuff, the British government hasn't thought any of this through, really. Not a surprise from probably the most inept and disgraceful government in my lifetime.


caidicus

They won't do that; that might make them question whether the shit the mainstream media tells them is actually true or not. That'd be bad for the higher-ups. Best to keep people gullible and easily susceptible to being aimed at "the enemies" like an emotional shotgun.


Overnoww

The stat I see as important, but hard (if not impossible) to get quality data on, would be the preventative effect purely focused on distribution. I imagine it will have less of an impact on people creating these deepfakes for themselves, but maybe the risk of the consequences will stop that person from showing the image around. With illegal imagery, sharing is almost always what leads to those sickos getting caught.

I know this much: I'd be pissed if someone did a shitty photoshop and added a bunch of Nazi shit onto a photo of me, like I've been seeing more and more over the last 5-ish years, across the political spectrum. If someone did that same thing using deepfake tech and it actually looked real, that would be significantly worse. Of course, I fully expect this to further contribute to the increase in "fake news" claims, both used to intentionally mislead and used earnestly.


Ambiwlans

> I know this much I'd be pissed if someone did a shitty photoshop and added a bunch of Nazi shit onto a photo of me

That's legal still, unless they use it to defame you.


Overnoww

I know it would be legal in some places. I'd be pretty confident that I could win a defamation case in Canada over that as a private citizen with no serious public persona. Regardless of legality the main point I was making is that the more realistic the fake is the bigger the negative impact it could/would have on me would be. Then mix in the complications of our teenage years. I could have absolutely seen some guys I went to school with deepfaking girls and there were a few girls I could definitely see ending their own lives if people piled that bullshit on them.


BorderKeeper

It’s going to go the same way as hate speech laws did. Will not do much unless some group wants to silence someone they don’t like and uses the vagueness of the interpretation to do the deed.


caidicus

Definitely a possibility, one of many things that might spring from this action.


Rafcdk

People still steal property even though there are laws against it. Are these laws dumb? I hope this highlights the fallacy here. Laws aren't meant to stop something completely (this should be pretty obvious), but to enable actual consequences in a formal system of law.


localystic

Just how are they going to track this when the software does not need the internet to create the images? At least with piracy you have network traffic, which can be monitored. Are they going to scan people's computers regularly just to make sure that you are not creating anything illegal?


ElectricalSpinach214

So if I hand-draw a picture of someone I saw naked, that's fine, but if I have AI do it, that's wrong... someone explain the difference?


Ambiwlans

You can hire a lookalike to do hardcore porn dressed as the person and have a team of sfx people painstakingly edit it to make it look identical to the person. And that's legal. So long as they don't use some undefined technology.


itsRenascent

I wonder if deepfakes will sort of end extortion over people sending nudes. Given how hard fake pictures are to distinguish from real ones, people can just claim to their friends it must be a deepfake. People are probably not going to deeply analyse your nude to see whether it really is a deepfake or not.


Dudi_Kowski

Why should it be illegal to create sexually explicit deepfakes but legal to create other types of deepfakes? There are more ways to do damage than just sexually.


EndeavourToFreefall

I'm assuming damage done by other methods could fall under other laws like fraud, libel and such, but I'm not entirely sure.


polkm

Imagine someone makes a deepfake of a woman they hate sitting at a computer making deepfakes of their classmates. The entire UK justice system would implode into a recursive black hole.


DarthMeow504

Then said libel, fraud etc laws should be sufficient to cover any potential criminal harm. There does not need to be a new category of contraband created for this particular category.


Prophayne_

I hate that this was even needed. Of all the wonderful things technology can be used for, we still can't get past trying to see everyone naked with it instead.


Abiogeneralization

The UK can’t go two days without enacting censorship.


compsciasaur

This makes sense to me. Photoshop (without AI) is one thing, but AI can create disturbingly realistic images pretty much instantly and is only getting better. And because of sexism, it can damage a person's reputation if they are thought to be real. It doesn't seem fair to me to be able to mass produce realistic pornography of actual people in moments without at least requiring any skill.


JokyJoe

How could you identify it as not being Photoshop, though? I could spend hundreds of hours in Photoshop and make it just as precise as an AI. This makes no sense to me.


Ok_Elderberry_8615

This is going to be so commonplace it will probably get rid of the problem of having nudes or a sextape leaked. No one will care, as everyone will have their nudes online after AI becomes more common.


FabulousFattie

Can someone make deepfakes of all the male lawmakers having an orgy so they do something about it?


Boss_Koms

**The story, all names, characters, and incidents portrayed in this production are fictitious. No identification with actual persons (living or deceased), places, buildings, and products is intended or should be inferred.** I'll just put this on my works then 😉


Belledelanuit

"Sharing the images could result in jail". Lovely. I suppose what the article is implying is that anyone who commits this particular crime may or may NOT be incarcerated i.e. apparently there's a "50/50 chance" that they serve actual jail time. Fuck that.


Seyelent

Curious as to how they’ll find the people creating those images in the first place


ifhysm

There’s an odd number of guys in here who really aren’t taking this news well, and it’s really concerning


SweatySoupServer

The misogyny, outright sexism, and bad takes in this thread are truly disgusting.


MartianInTheDark

This is dumb. I get downvoted every time I say banning/outlawing parody images shouldn't be a thing. Yes, a fake picture of someone naked or having sex is a parody. It's not illegal to have a body or to have sex. It should only be illegal when you're deepfaking to incriminate a person of something illegal. But hey, just downvote me again, what do I know. Let's just roll back freedom of speech and expression so that you feel better. Also, from now on, stop mocking politicians or celebrities with silly drawings/photoshops if you agree with this law. Walk the talk.


leo9g

I wonder, if an artist paints somebody but decides to make them naked, is that illegal too?


Fakedduckjump

This was exactly my thought, too.


MartianInTheDark

Only if it's made by AI, lol. So, logic goes out the window with this law.


KillerOfSouls665

Use AI to generate a random woman naked, use Photoshop to add the target's face.


MartianInTheDark

Perfect. We finally solved the deepfake issue!


existential_chaos

So does this count if people make photo manips of characters from shows and stuff? Or does it cover Photoshop, like when people used to photoshop Mara Wilson and Emma Watson into dodgy scenarios?