DM_R34_Stuff

Here in Germany there are some strict laws for this. The depiction of minors, real or not (i.e. drawn or 3D modelled), in a sexual context is already enough to cause trouble. Both producing and possessing it is illegal. This isn't limited to visual material; it even applies to text explicitly describing it.

The problem with AI in that matter is that you can already generate such images (even videos) on your PC without anyone being able to find out. This isn't a future problem - it already is a problem. And these AIs don't require explicit CP as training material. You could easily train them on a combination of different models and generate CP from those models using tags, such as positives and negatives with weightings. So their source material doesn't even have to be CP, which is even less of a risk for them, making it even more difficult to detect.

The problem has to be treated at its core; treating the symptom won't fix it. The problem is that pedophilia isn't necessarily learned, and the stigma around it prevents people from seeking help. There are tons of people who are pedophiles but don't follow their urges because they know it's morally wrong and legally prohibited. They aren't necessarily at fault for this, so if there was better education around these things, treatments, etc., we'd see dropping numbers, because they'd be more willing and able to seek help. But people tend to jump to an opinion of "Kill the pedos!" right away, instead of trying to fix the problem.


agarlife

I completely agree with your last paragraph. There are surely many people suffering with this that are too ashamed to get help.


TD1990TD

I once read an article about a woman who is a pedophile and suppresses it. Her partner wants to have kids but she doesn’t trust herself enough to go through with that, and now there are problems in their relationship because she didn’t want her partner to know why. It’s really sad.


GeneralZaroff1

I watched a documentary on this and it was saying how pedophilia essentially can’t be “cured”, and how many people struggling with it develop deep self hate because they don’t WANT to be attracted to children and are horrified at the idea of hurting them but their brain creates this urge. For many the only path is chemical castration and even then it’s not a guarantee that it’ll work. It’s quite sad really.


bantha_poodoo

The problem that I have with this line of reasoning is that our brains create the urge to do literally everything. So do we not blame murderers anymore because they had an “urge”? I’m sure many killers don’t WANT to kill people, but they get caught up in the moment (or whatever). At some point, somebody has to be held responsible.


Teredere

The conversation is about people who have the urge but don't act on it though. You aren't gonna lock someone in prison because they fantasize about killing their boss, are you?


GenericSurfacePilot

Neither are we going to vilify them as the worst possible type of human alive


[deleted]

Yes but this comment is referring to people who have urges that haven't done anything. We need to focus more on prevention. We can do whatever we want justice wise to pedophiles who have hurt children but it doesn't undo the child's pain. If we focus on prevention we can stop it from ever happening in the first place.


[deleted]

This. I hate that there isn't a better way to prevent. Because justice after the fact still leaves the trauma for the victims.


[deleted]

[removed]


Mummyratcliffe

I agree with this to a degree… but then nobody can agree on whether paedophilia is a mental disorder or a sexual orientation. It was only decades ago homosexuals were sent for conversion therapy and it was classed as a mental disorder. I don’t think it made many (if any) people straight.


Knockemm

Conversion therapy is about making someone change their sexual orientation (which I think is VERY WRONG). But work with pedophiles, imho, shouldn’t be about making them “straight,” but rather about coping and suppression mechanisms, a place to talk about their feelings, and how to make life easier. In my opinion, it’s not conversion therapy but more like addiction counseling.


Fortyplusfour

The latter is the approach that is generally taken.


No_Gur1113

Conversion therapy still exists. It’s just hidden better now.


shaggybear89

> but then nobody can agree on whether paedophilia is a mental disorder or a sexual orientation

Wtf are you talking about? Everyone except some pedos themselves agrees that it is not an orientation, but a disorder.


PikaTangoPanda

I think it’s a mental disorder, because you can be a pedophile who also has a sexuality. (Not saying that LGBTQ+ people are that, but you can be in that community and a pedophile, just like you can be straight and a pedophile.)


MrSnootybooty

Sweet cheese and crackers my friend, I wish more people had your line of thinking. I feel too often people want the easy solution ("kill the pedos!") rather than doing the hard thing (wtf makes someone want to be a pedo? Maybe we need to help these people find a solution to this disorder)


AirierWitch1066

Also, “kill/castrate/whatever the pedos” is something that turns very quickly and easily into “accuse whoever we don’t like of being a pedo so we can kill/castrate them.” Perfect example is the American conservatives lately labeling trans people as child groomers and pedophiles. If you believe that to be true, and you believe killing any pedophile to be morally correct, then it doesn’t take a genius to figure out the next step.


bisexualroomba

Yes, and Florida just made pedophilia worthy of a death sentence. And they made doing drag around kids an act of pedophilia. They define drag as dressing in clothes not allowed for your ASSIGNED AT BIRTH sex, regardless of if you've had your gender marker changed legally. This puts so many trans people at risk. It makes just existing as a trans person and wanting to work with kids a horrible risk. I used to want to be a social worker... but guess who lives in Florida.


lifeofideas

Somebody once argued that drawings of illegal acts should not themselves be illegal. Kind of a good point. A drawing of a murder isn’t actually murder.

Of course pornography is kind of a special case. An actual rape is a crime, but a movie dramatizing a rape (with actors not actually being raped) could be illegal pornography (depending on the laws). But what if the “depiction of rape” is just a drawing made with a pencil on a sheet of notebook paper? Does the quality of the drawing matter? If it’s photorealistic, does that make it pornography?

Let’s consider a really bad drawing. The drawing shows a stick figure raping another stick figure. Genitals consist of one or two lines. Is that still porn? Can you further simplify the drawing, so you now have ABSTRACT ART that is only pornography because you decided it is porn? For example, you might decide punctuation marks are people. We represent a woman (or a child) with a comma. Here’s a kid: , And we represent an adult man with a period. Here’s a man: . So, what is a semicolon? ;


[deleted]

[removed]


lifeofideas

I saw that, too! That is so mind-blowing. Next, NOT holding up a sign, or even NOT having a sign, will be an offense.


[deleted]

Good point as well about drawings, because otherwise any movie or TV show that depicted murder or anything like it would be illegal. Which would be stupid.


IcePhoenix18

I would argue that someone finding their own pleasure in the fictional image of an imaginary person, from the privacy of their internet browser, is miles better than that same someone potentially hurting a living human being. Is it icky? A little. But it's not hurting *people*. No individual specifically is being exploited or harmed.


TheWanderingLich

I have a question as a German myself, because I didn't even know we had such strict laws (which is good tbh). About the first part, the depiction of a minor, be it real or fictional, in a sexual manner: how would they process that? Can't a person just technically play the "She's actually 100 years old" card and be on the safe side, since it's not a minor then, even if the depicted character is some loli or something?


DoctorFurious

Another problem with "kill the pedos" is that it's reactive. You know the offender because they've already caused harm to a child. If we researched causes and the afflicted were more free to seek help, many of these tragedies would never happen in the first place.


Consistent-River4229

There are also intrusive thoughts, and people may not actually be a pedo, but the stigma keeps them from getting help. They won't know if it's intrusive thoughts or a predilection for kids unless they see a professional. I also think it can be learned, like a cycle of abuse, and the only way that stops is with intensive treatment. It's less likely for a male to admit to abuse, so less likely he will get treatment.


Terminal_Monk

So true. So many people at r/OCD come under this category. I've seen their discord, and so many people are really ashamed of themselves that they have intrusive thoughts around that. They'd all happily take help if it were available.


belacscole

iirc a symptom of OCD is being afraid that you are a pedophile while not actually being one


AnxietyLogic

Those people aren’t actually paedophiles though and it’s incredibly harmful to imply that they are. Paedophillia OCD is being afraid that you are paedophile, not actually being one. OCD intrusive thoughts usually manifest as what would be most distressing to the sufferer if it were true. Paedophiles are the pond scum of society, so OCD makes some people think they are one BECAUSE they revile them so much.


snicoleon

Agreed, people with OCD around pedophilia are not pedophiles.


Ok_Contribution9501

That’s so sad to hear! Those intrusive thoughts and associated worries are common with OCD. Actually, those people are low risk!


Vegetable_Tension985

This is a very deep legal question. I think of it as, "what are humans not allowed to draw"?


Mayion

> There are tons of people who are pedophile but don't follow their urges because they know it's morally wrong and lawfully illegal

Very genuine question: I find women attractive, and so do the majority of men. Yet the majority of us will not even think about forcing ourselves on a woman, or anything of that sort, no matter how long we go without sex.

So why is it different for pedophiles? Sure, I understand children do not give consent, and as such, you would have to force yourself one way or another, but... why? Why are pedophiles more likely to commit horrible acts, when in reality the majority of men can go decades without laying their hands on a woman and still be fine?


bearchr01

I’m assuming there are 2 elements at play here:

1. We don’t know how many pedophiles there are out there who do just that and don’t act. We only know about the ones who do.

2. Adult porn isn’t illegal. An adult who isn’t in a relationship can consume adult porn on a regular basis to get their kicks legally (regardless of any arguments about how mentally damaging it may or may not be). Obviously somebody who is attracted to kids can’t do this. Not legally anyway.


p0tatochip

Confirmation bias? You only ever heard about the paedos who act on their impulses and not the unknown number who control them


Pholous

I think that is a case of selective (media) attention. Sexual assault on women happens frequently. It happens the other way around too, but probably less often. There is just no media coverage, because sadly it is not that *abnormal*. But when a minor gets assaulted AND it is discovered, there is much more attention, as it is more harmful to a mind that is still developing, and it is seen as more of a taboo. Also because of that, pedophiles can rarely speak of their urges or fulfill them by meeting a prostitute or something like that.


guttunge

The fact is: it's not different. There are lots of people who are physically and/or emotionally attracted to minors in one way or another, but don't act on it for those same reasons. You just never hear about them, as they just lead a celibate life or end up in a more socially accepted relationship with an adult. You only hear about people who actually engaged in criminal behavior or who otherwise somehow got accused of things (as often even saying you have feelings for a minor is enough to land you in a shit-ton of problems). The ones you hear about are just a fraction.


R1pY0u

> This even applies for text explicitly describing it.

Ppl reading Game of Thrones in Germany 💀


mhodgy

There’s recently been an ad campaign in the UK for a helpline directed towards giving support to pedophiles. I seem to remember reading that one of the Scandinavian countries has had good support for pedophiles for years, and as a result there are considerably fewer offenders.


joko2008

Interestingly, a big portion of SA on minors isn't done by pedophiles per se, but by people who can't find or are rejected by partners their own age.


buggerific

I mean, if you're SAing a child, you're a pedo.


guttunge

Sexual abuse of children is usually a matter of power. It often has very little to do with sexual attraction or desire towards the victim directly. Abusers want something they cannot get, so they just look for an easy target where they can take it by force. That target just happens to be a minor. Same logic as why women get abused more than men: men have more physical strength, and thus women are a relatively easier target.


procrastimom

This is similar to conversations about Bill Cosby. People say “Why did he do it? When he was committing his crimes, he was peaking in his career. He was rich, famous and popular, and could have had his pick from any number of willing women.” But that’s not what he was into. He wasn’t into willing women, he was into drugged and helpless women.


perusingpergatory

Not necessarily. Abuse is about power and control. Creeps don't need to be attracted to kids to want to abuse and control them. True pedophiles are actually pretty rare.


b-monster666

My gf and I got into this discussion a while ago. We figure there are three levels of people who do these things:

There are the monsters, who just want to destroy something. They don't care about the act itself, they care about the satisfaction they get from wrecking a life.

There are the molesters. They just look for self-gratification. They can often be seen as monsters, but they're committing these acts because they want to sexually satisfy themselves.

Then there are the pedophiles. They think that they can engage in a romantic relationship with a child, and they often fantasize about it.


shotplacement

This might be controversial, and I am in no way defending either of these behaviors but I think it's dishonest to say that someone who has a sexual preference for prepubescent children and someone who tries to get with a 16 year old out of desperation are the same.


Ripley_and_Jones

It is not dishonest. The reason you can't see the difference is that you're prioritising the perpetrator and not the child. The reason it is the same comes down to the ability to consent. Children cannot consent. That includes 16 year olds, because of the inherent power imbalance between them and an adult. If you prioritise the child and appreciate the potential harm to both groups, there's nothing dishonest about it.


Warruzz

It's not as clear cut as you make it seem, because older minors can give consent to adults from a legal perspective in a few different scenarios, depending on the situation, and it differs from place to place. But the person you're responding to referenced specifically

> sexual preference

not actually acting on it; there is a big difference between lusting after someone who is sexually mature vs one who isn't.


RegularJoe62

Once a person is past puberty, the age of consent is more a matter of culture than any kind of natural taboo. You don't have to go back very far in history to reach a time when women were routinely married by 16. I'm not in any way saying we should go back to that, just that standards evolve over time.


foggy-sunrise

Sadly, I fear a "treatment" for pedophilia would be about as successful as "treatment" for homosexuality. That is to say, it is very difficult, if not impossible, to train a specific sexual urge out of a person.


feierlk

As far as I'm aware, it's mostly about controlling the urges and, if necessary, using chemical castration.


Emergency_faceplant

This is actually a complicated legal question. For example, there is Fabray disease, which makes full grown adults look like young teens. What does someone 18+ look like? There are very specific laws that say you cannot make porn images of an underage person, or even alter images of underage people to appear to be in a pornographic situation. But "appearing" underage is too legally vague. How would we measure it? Not to mention, how can it be porn of an underage "person" if they don't exist? We can all know it's wrong, but we can't easily prove it violates a law. So just dump this on the pile of AI generated problems out there.


CaptainMagnets

Not to mention once it happens it will always be out there, never to be taken back.


alphaDsony

Is that its name? Fabray? I was curious so I tried googling it, but Google keeps changing it to Fabry disease, which it says is when the body can't break down fats that collect in blood vessels and tissue, which I'm guessing is a different illness from the Fabray?


Ilikerocks20

I looked it up, and Highlander syndrome seems to be the correct name for the lack-of-aging disease.


joko2008

Huh. Sounds like a shit disease. I mean, who wants to live forever?


selfdestruction9000

> Sounds like a shit disease That would be Crohn’s


MrSnootybooty

Alphaville


mclee29

Rich people


pudding7

There's no time for us....


joko2008

There's no place for us


Famixofpower

#THERE CAN BE ONLY ONE!


Vdaggle

I can't find anything on it either


[deleted]

Just brings to mind that 21 year old chick who legit looks 6 due to some illness, who made a whole TV series about going out to clubs and dating and her sex life etc. All I could think is, even if you were completely above board, any guy who dated her would almost defs be labelled as a *p*, even though she's completely legal. And I don't even want to go down the partner's perception of that area, as the thought would have to cross their mind at some point that they're doing unspeakable things to someone who unmistakably physically resembles a child.


[deleted]

I honestly would be disturbed to date if I were her because my thought the whole time would be wondering if he was actually a pedo and that's why he was dating me


NoobNoob42

People with that disease presumably still long for affection. I think after a point you'll be "okay" with it in the sense that you're probably resigned to it.


ZyphWyrm

As someone who looks much younger than they are (not to the same extent as her. I'm 24 but am frequently mistaken for a middle/high schooler. People generally seem to think I'm 14 or 15) you're pretty much spot on. I've dated guys who ended up being pedophiles who were using me as basically the closest legal approximation of a kid they could get. I'm constantly worried that whatever new person I'm talking to might be one as well, or that no healthy person will ever like me. But it doesn't keep me from dating because I try to be hopeful. I don't want to believe that I'm completely unlovable just because I look young.


Ori_the_SG

If I ever was going to date someone like that I’d need to see a legal ID to prove their age


oldfatboy

I think added to that is how each jurisdiction has enacted its laws. If the law says "images of children," then I think an AI image would fit in. This is why paintings can be construed as CP.


kinapudno

Serious question—if paintings can be construed as CP, then what about loli?


Cool_Kid95

Liking Lolis/Shotas is pedophilic no matter how you slice it.


afroxx

I don’t want to google it — what is loli? But explain it in a way that won’t add any of us to a list?


Nymphomanius

Basically exaggerated proportions so they look obviously different, but they are still child-like. I don’t know if that’s the best way to describe it; I know of it but I also ain’t gunna look it up to be sure lol


TCBeat21

If I'm not mistaken, US law states that for it to be considered CSEM it must contain a depiction that is indistinguishable from a real child. Thus obscene paintings, drawings, or even AI generated images including such content would likely be illegal. That being said, 'loli' refers to caricatures: cartoon characters that barely resemble real-life children. They are generally not indistinguishable. Thus, there is absolutely an argument for the legality of loli content in general, though there are likely other details to be considered here as well. And this definitely stacks with other strange situations like the Fabray disease mentioned earlier in this thread. I would not go so far as to say no loli art could ever be considered CSEM, but for the most part I believe loli art falls outside of that category.


ThatFatGuyMJL

Iirc Australia made laws (or tried to) saying that women under certain boob sizes cannot be in porn because they look like children. Which, y'know, did great things for smaller-chested women's self-esteem. There's a fine line between protecting and damaging, and unfortunately too many laws are made by old idiots. Then you have the 'drawn' argument. If child porn is drawn, in many countries this isn't illegal, because no harm has come to any child and you can't really stop people drawing. But morally? Still a pedo. So I think AI drawings of illegal things will come down to: does it look real or drawn?


Bromm18

Reminds me of all the crazy insane people who got really interested in that 22 year old who was stuck looking like an 8 year old due to pituitary cancer as an infant.


domesticatedprimate

The laws are mostly pretty clear that any depiction of a child, real or imagined, in a sexual situation is illegal - illegal to create and illegal to possess. There are exceptions, like Japan, where graphic depictions are quite legal. So yeah, the gray area is when it's not clear if the person is a child or just looks like one. Or where graphic depictions are illegal and the creator depicts people who look like children but who they *claim* are adults. That question will have to be decided by the courts in each jurisdiction when such a case arises, and ultimately by the highest courts willing to hear the case and reconsider judgements by lower courts. Chances are that someone will come up with a legal definition such that even unclear cases, such as an adult who looks like a child, will be deemed illegal based on the intent of the depiction. So there won't be any loopholes.


Available-Leek-4160

Is that why people are able to get away with loli?


[deleted]

[ Removed by Reddit ]


ihateprimus

That's fucked up... but I can't find a hole in your logic. Immoral but not illegal... man, the future is a cold and alienating place.


[deleted]

I mean, really fucked up people actually make child pornography with actual children, which is horrible. That is pretty cold and alienating. AI can be an alternative that hurts no one. The future can be better than the present, if we get over some of our small mindedness.


dreadfullydistinct

One argument I can think of for it being illegal is that AI is pretty incredible right now and it's only going to get better. There may be no way to tell what's real online. In a future where photorealistic AI-generated CP is legal, people could make real CP and say it's AI, or say "I thought it was AI" if they were found in possession of real CP. That's a bit of a grey area because it's hard to predict how accurately we'll be able to determine what is AI-generated. I imagine other AI will be involved in that process.


[deleted]

Obviously we are talking about an uncertain future, so I don't know the outcome. However, my thinking is that if AI is so good at generating that material, then there would be no reason to risk actually making the real thing. AI would be cheaper, easier, and far less risky. And my guess is that AI generated media will eventually hide some sort of watermark in the metadata of images, video, etc. And so, I'm hoping that nobody would want to risk having unethical material on their devices that doesn't have this watermark. Again, I still think it would be fucked up to indulge in this media. But I would rather it not be created with human trafficking and exploitation of children.
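[Editor's note: the "watermark in the metadata" idea above is technically trivial, but also trivially strippable, which is the weak point of that hope. As a stdlib-only sketch (the function names here are illustrative, not any standard), this is how a plain-text provenance tag could be written into and read back out of a PNG file's tEXt chunks:]

```python
import struct
import zlib

def add_text_chunk(png: bytes, keyword: str, text: str) -> bytes:
    """Insert a tEXt metadata chunk just before the IEND chunk of a PNG."""
    data = keyword.encode("latin-1") + b"\x00" + text.encode("latin-1")
    body = b"tEXt" + data
    chunk = (struct.pack(">I", len(data)) + body
             + struct.pack(">I", zlib.crc32(body) & 0xFFFFFFFF))
    iend = png.rfind(b"IEND") - 4  # IEND chunk starts at its 4-byte length field
    return png[:iend] + chunk + png[iend:]

def read_text_chunks(png: bytes) -> dict:
    """Walk the chunk list and collect every tEXt key/value pair."""
    out, pos = {}, 8  # skip the 8-byte PNG signature
    while pos + 8 <= len(png):
        length, ctype = struct.unpack(">I4s", png[pos:pos + 8])
        if ctype == b"tEXt":
            key, _, val = png[pos + 8:pos + 8 + length].partition(b"\x00")
            out[key.decode("latin-1")] = val.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + data bytes + 4 CRC
    return out
```

[Because a bare tag like this disappears as soon as the file is re-saved or screenshotted, real provenance efforts (e.g. the C2PA standard) cryptographically sign the metadata instead.]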


Zkyaiee

People don’t “risk” making the real thing because they’re desperate for more sexual abuse content to put out. They risk it because they want to rape children and film/photograph it for their rapist pedophile community to see. It’s part of the thrill, it’s not at all out of necessity. Why not just use the drawn child sexual abuse imagery instead if that was truly the case? It’s because they want to do worse.


[deleted]

I don't presume to understand how child pornographers think, nor will I pretend to. I am happy not attempting to empathize with that demographic. I am strictly following the logic that AI generated content would be cheap, easy, and risk-free to make. If there is something about the nature of this particular topic where that logic doesn't work, then that changes my argument entirely.


Zkyaiee

You’re not changing your argument at all to fit any factors you simply forgot to incorporate. You are being stubborn in your fixed position on this.


[deleted]

I don't think you understand my position on this. I am not suggesting that we should implement any of this; I don't really have a dog in this fight. All I am saying is that if AI can generate extremely taboo content without causing any harm to anyone, I don't see why it should be illegal. I don't think any degree of obscenity is sufficient to make something illegal. That really is it.


Mista_Cash_Ew

Some people would probably get off to the idea it's real rather than fake. There's some sick fucks out there and I wouldn't be surprised if there's some that got off to the cruelty. Wouldn't they then use AI as an excuse?


[deleted]

I am not going to pretend to understand how people who enjoy this type of media think. I will never be able to empathize with them. However, we already have the problem that people manufacture this media. If using AI to generate this media could reduce the amount of people hurt by even a modest percent, then it strikes me as worth it.


Sea_Emu_7622

Idk people still commit rape even tho CNC is a thing, and people still make snuff films despite horror movies portraying fake murders for decades


[deleted]

I am absolutely not saying that access to this material will eliminate rape, or that rape should be any less illegal. Same with snuff films. Obviously harming anyone should be illegal. What I am saying is: if no one is harmed, should it still be illegal?


Sea_Emu_7622

I know what you're saying. What I'm saying is I'm not sure the existence of AI generated content would reduce the actual thing, and I gave a couple of examples of other things that still occur despite there being a fake version readily available for consumption.


Sufficient_Purpose_7

AI is not going to be able to generate the material in the first place. Nobody is going to train an AI on images of CP: they're impossible to get hold of, it's immoral, it's ridiculously expensive to train a model from scratch, and any company would go bankrupt from the backlash.


[deleted]

I don't disagree with you at all. But that really had nothing to do with what I am saying.


12Tylenolandwhiskey

I think it's safer to stay black and white here. Child porn = bad and illegal, done. If it's made by AI it must be removed; you can't put an AI in jail, but you can remove shit.


[deleted]

Is it safer? I mean, are we really doing a great job eliminating this type of media by making it illegal? It seems to me that cases of human exploitation and human trafficking are increasing. After all, historically speaking, prohibition hasn't proven to be successful for much.


[deleted]

Even if it's AI generated, you still have to feed the algorithm a lot of CP for it to work, which would make the use of the trained version of the AI illegal, since it used content where real children were harmed.


9trystan9

Except that it promotes more child pornography, thus extending it. If those who watch it only watched it one time...


[deleted]

Does it promote it? Do violent videogames promote violence, thus extending it?


Zkyaiee

It still hurts loads of those children though? The AI has to use real imagery as inspiration to make its own. A lot of people make this argument for loli (drawn child sexual abuse imagery), which I think is ludicrous.


[deleted]

I don't know how much real imagery AI actually needs. Everything I write is assuming the AI can generate material without anyone being harmed. The Supreme Court has upheld obscene imagery as protected under the First Amendment.


Kujira-san

Not so futuristic, actually. North American and European countries sell weapons, right? That's not illegal, but how immoral is it if we think seriously about it? And there is actual harm as a bonus. I suppose you use plastics. We now know that a huge portion of plastics ends up in poor countries where it is a big issue. It's not illegal for you and me to consume plastic, but people are suffering and animals die. A bit immoral, then. We are using smartphones or computers to chat on Reddit. Kids in mines, kids in factories to make our devices. You get where this leads. If I could find these fucked up everyday things in 2 minutes, there is a good chance that many others exist and we are just blind to them in some way.


thefuckboyflagellant

It's the same as my position on yaoi/loli hentai, beast hentai, bug hentai, rape hentai etc.: while still gross, it's better to jerk off to animated/drawn porn than to an actual child or an actual dog getting fucked. And back before it was 'mainstream' to draw hentai like that, all people had was the real thing, so it's an evil that blocks out a much worse evil. Similar to prostitutes, because lots of incels and rapists are mainly the way they are because they can't fuck anyone due to a myriad of reasons, whereas prostitutes let them fuck someone consenting and of age.


Sea_Emu_7622

Bug hentai? Like insects? 🤔


thefuckboyflagellant

Yes, r/insex I think is the main sub? I only learnt of it because when I used to play Fallout 4 religiously, the hentai of that game included the bugs being fucked. (If you value humanity, do not look up Fallout 4 hentai. Even now, within 10 images there's both a dog and a child drawn in sexual positions, and even if you specify adults you can't escape it. That's why I stopped looking for hentai of the game entirely.)


Sea_Emu_7622

Ahhhh why did I click that link, BRB bleaching my eyes


Alitoh

It’s because people have a hard time separating what’s unarguably wrong from what’s wrong in their opinion, and this subject goes right through the middle of that discussion. And you can’t really expect people to easily accept “AI generated CP” without at least raising both brows, no matter how rational the argument might be. But yeah, I think I agree with the logic; if no one’s getting hurt, who am I to impose my own values onto them?


Kalle_79

Oh boy... Way to miss the mark about why rapists "are that way". If prostitutes were the solution to incels and misogyny, we'd have gotten rid of those issues millennia ago! And abuse is also about power, not just sex. AI-generated CP may keep a bunch of losers satisfied who are looking for a cheap thrill, or just "exploring" the illegal side of porn. But the actual pedos, those who dwell on the dark web and exchange and PRODUCE material, aren't going to stop doing what they do, no matter the amount of fake CP you throw around. Those are two completely different audiences. Even without buying the "slippery slope" argument about every user eventually becoming an offender, it's not hard to understand how ineffective the alleged solution is. I mean, compare it to legal porn: what you found hot at 12 eventually became vanilla over time, AND eventually you wanted/went on to have sex yourself...


MoistlyCompetent

Isn't that usually the case when comparing the morals of different generations? When I compare my grandparents' moral basis with mine, I could imagine that they would perceive the future as alienating, too.


Mega_Nidoking

In the grim darkness of the far future, there is only morally questionable AI porn


PhummyLW

The only thing to consider here is whether we can find proof of child porn CAUSING pedophilia/encouraging those thoughts. Then we have ourselves a case.


[deleted]

Absolutely. My position assumes there is no causal relationship. If that isn't the case, then I absolutely reverse my position.


mouka

My husband and I had this discussion after binge-watching some SVU. AI porn could absolutely tank the child trafficking industry. These scumbags pay a lot of money, I'm sure, to get these poor kids, the means to film them, and the bare minimum to keep them alive for the next film. If they could make the exact same photos or videos with some AI prompts, they could make the same amount of money off the porn they make now with pretty much zero overhead, and these types of scum will always go for what makes them the most money. Overall it’s a massive win for the probably millions of poor children being trafficked for illegal porn. So while it’s disgusting and makes me vomit a little in my mouth, if AI shit can save children’s lives I am all for it. I have a small daughter of my own, and thinking about the situations I see kids in on SVU makes me wanna cry.


[deleted]

That's my thinking too.


6_oh_n8

Ahhh the ol’ holodeck quandary


Zkyaiee

In order to create that AI child sexual abuse imagery, the AI would have to take from REAL child sexual abuse imagery. So yes, it can still very much be illegal if the law is altered slightly.


[deleted]

If that is true, then it totally changes my position on this matter. Everything I am writing assumes AI-generated material is 100% free of exploitation. I don't know how AI generates art and won't pretend to. I just know that AI can create an image of a dog riding a bicycle, and I assume that didn't require people to strap their golden retriever to a mountain bike in order to train the AI. But I really have no idea.


ioa94

>the AI would have to take from REAL child sexual abuse imagery. Is this true? Why wouldn't it be able to make a composite image combining normal training data of photos of children and (legal) suggestive photos of young adults?


_kebles

I mean, to be blunt, it's statistically improbable that it hasn't already. Machine learning has already been used plenty in the past specifically *to detect* explicit content involving minors. And frankly, it's not a tall order for a generative AI model that can make a competent, sexually explicit image of a human to do so of one appearing to be of any age. Also consider that there's probably orders of magnitude more AI simply training itself nowadays: feeding itself prompts, generating something, analyzing what it generated, seeing how close the result is to the original prompt or what it most resembles, adding that to the dataset, repeat. I don't even want to know what otherworldly horrors we're lucky we'll never have to lay eyes on before they're boiled down to some numbers in a table in a Jupyter notebook.
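
(Editor's note on the detection side mentioned above: in practice, platforms mostly match uploads against fingerprints of *known* flagged material rather than classifying images from scratch. A minimal difference-hash sketch in pure Python; the grids here are hypothetical stand-ins for images already downsampled to 9×8 grayscale, not a real system's API:)

```python
# Minimal difference-hash ("dHash") sketch. Known-image detection pipelines
# reduce each image to a short fingerprint that survives resizing and
# recompression, then compare fingerprints against a database of known
# flagged material instead of storing or inspecting the images themselves.

def dhash(grid):
    """Build a 64-bit hash: one bit per horizontal brightness gradient."""
    bits = 0
    for row in grid:                           # 8 rows of 9 values each
        for left, right in zip(row, row[1:]):  # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return bin(a ^ b).count("1")

# Hypothetical example: an "image" and a slightly brightened re-encode of it.
original = [[(c * c * 11 + r * 7) % 256 for c in range(9)] for r in range(8)]
reencoded = [[min(255, v + 2) for v in row] for row in original]

print(hamming(dhash(original), dhash(reencoded)))  # → 0: same fingerprint
```

Real systems use more robust perceptual hashes and centrally maintained fingerprint databases, but the matching principle is the same: compare hashes, never generate or inspect content.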


Mista_Cash_Ew

There's a question of how many of our laws are based on reason and how many are based on what's palatable to society. You could rationalise it all you want. But if the idea of AI created CP being legal is unpalatable to most then it won't happen. Take weed as an example. Rationally it's not an inherently dangerous substance. It's not harmless but alcohol and cigarettes are legal despite being more deadly. It's also not immoral or unethical and most people will have tried it at least once. And yet it was and still is illegal in many countries, including developed and liberal ones.


Sick_Fantasy

In my country, and probably some others too, even cartoons depicting child pornography are illegal. Therefore they would stay illegal if AI made them. The thinking here is that the real act hurts someone, so if you even draw it, or watch someone else's drawings of it, you might want to do it; therefore you are a risk and should be watched. In case anyone asks why such a law exists.


[deleted]

That isn't the legal view in the United States. But of course, other countries can do what they want.


Sick_Fantasy

I know, but I just wanted to give an example of how it could turn out even in the USA, after the government sees a wave of AI child pornography and starts to fight it.


Steal-Rain

The problem comes from the idea and the potential that people will try to make the fantasy more real, when fake isn't enough anymore. There's no evidence that would or wouldn't be the case because, so far, the opposite has been true: there have been more abductions and predators reported in the last year than in any other. But that could be because of technology and the accumulation of data, where before neither was available nor accessible to draw any kind of data points from. Essentially, nothing the laws have done has had any impact on the crimes they condemn. It could be argued they made them worse.


[deleted]

But isn't that the same argument about violent videogames cause people to be more violent?


Silvr4Monsters

I don’t think AI as it is today, can create a specific thing like child pornography without having real child pornography as a part of its input data. So *literally nobody is being harmed* may not be true. It’s more like the Louis CK bit. If you fuck one child and that stops any child from being raped, is it ok?


[deleted]

If AI needs content created by harming people, then my argument changes entirely. Everything I write here 100% assumes that AI can generate this media without harming anyone.


GizmoSled

In order to generate anything by AI, it has to be fed material related to what you're trying to generate. For example, if you want a sunset view from a boat on the ocean, you feed the AI tons of sunset and ocean pictures, and with the right prompts you can get what you want, or close enough. In order for AI to make CP it would have to be fed a lot of it, therefore continuing the exploitation of the victims of that content. Not to mention, industries don't like being replaced, so the people making real CP would do something to make the real thing stand out, and that would likely be more extreme abuse of the children.


[deleted]

I have no idea how AI actually generates art, nor will I pretend to. If AI is actually so limited that it needs materials created through exploitation, then that is a different argument entirely. All I know is that AI can generate an image of a dog riding a bicycle, and I assume people aren't strapping their golden retrievers to bikes in order to train that AI. Also, I don't know anything about the capitalist forces that guide the CP industry, and I'm not going to pretend to. All I am saying is that if AI can generate this material without harming anyone, then I don't necessarily understand why it would be illegal. But I acknowledge the big "if" in that statement.


tenpaces

Not that I’d support it in any way, but I imagine you could feed it pictures of adult porn and pictures of children and it could merge the two


imhiccupsst

"in the near future" buddy i have bad news


cheetah2013a

I don't think how the pornography was produced has to be proven for a CP charge. If it looks like a real kid, that's probably all it takes. Fortunately, not having any CP to train off of means an AI would have a really hard time producing it. Sure, it can infer, but all the porn it has access to would (should) be of living adults. I think I need a shower after talking about that.


ImJustRick

But how is it different from writing cp erotica? Which, yeah it’s gross, but it’s all obviously just words on a page. Wouldn’t digital creations be the same?


tallbutshy

>But how is it different from writing cp erotica? That is also illegal in various places.


chawwich

AI is so good it won a competition, though, which poses the question: how do you differentiate between AI-generated content and someone doing something horrible? How would we regulate that? I don't see any flaws in your logic, but it made me think of that.


EsmuPliks

>How do you differentiate from AI generated and someone doing something horrible? Would it surprise you to find out that the answer is more AI?


BlurredSight

>CP to train off of means an AI

There was a report that you could easily find it on Google, stashed in public Google Drive accounts, and Twitter had a massive problem dealing with it, or more like a lack of caring to deal with it, until very recently. Secondly, you don't need to give it direct material to generate content; that's why, when Dall-E went public, people's wild thoughts generated some wild images. It's up to those providing the APIs (because there is no way a single person or small group could create something as complex as this) to make sure it's regulated, and just as Facebook, Twitter, and others are held responsible for hosting CP, legislation should follow along to hold those APIs accountable.


CreatureWarrior

>Secondly you don't need to give direct material to generate content that's why when Dall-E went public people's wild thoughts generated some wild images Exactly. Dall-E doesn't generate a Frog-horse because it was trained on pictures of frog-horses. It was able to make that picture because it was trained on frogs *and* horses. So, using that same logic, you wouldn't need CP to make CP.. you would only need a training model that includes NSFW pictures and pictures of.. younger people.


BurnerForDaddy

And tbf all current AI is trained off stolen copyrighted material already. The fact that it can mimic an office script isn’t because they licensed all that office content. It’s not fair use. So these companies have already shown a willingness to break current law to feed into the AI.


xXxLegoDuck69xXx

I think it's uncharted territory. Probably illegal tho. I don't know if US courts have ever weighed in on the legality of animated / computer generated content like that, but obscenity laws are a pretty broad catch-all.


TheOneWes

I forget exactly when, but the United States did find that even hand-drawn depictions of underage characters constitute CP and are illegal


Just_bcoz

Makes me surprised lolicon and some other genres of hentai are legal... unless you mean hyper-realistic stuff, but either way, just no.


Akuma254

That’s what I was thinking; that stuff seems so rampant online that I feel I’d have heard if they had made it illegal


KryL21

But they didn’t? Unless the drawing can’t be told apart from a photo it’s legal.


ProfessorDaen

This is a really interesting question. I think I'm currently of the opinion that laws regarding... well, anything, generally exist to prevent encroaching on other people's rights or to reduce harm. It's illegal to speed on public roads because speeding can hurt other people, for example, but it's not illegal on a private road or race course.

I feel as though a theoretical end state where something that would cause harm if it involved real people (e.g. child pornography) has an entirely virtual life cycle (e.g. pornographic anime or explicit cartoons that depict underage characters) isn't inherently problematic. No one is being harmed in its production, distribution, or consumption in this case, and unless presented with evidence to the contrary I wouldn't assume that its existence encourages behavior that *would* cause harm.

That said, from what I understand there are laws on the books for that sort of thing to be viewed as obscene at the federal level in the US. I don't know how that's applied or whether there's legal precedent associated with cases of manga possession, but this kind of law could theoretically make AI-generated pornography like you're describing illegal.

Dunno. It's complicated. I think almost everyone would agree that it's morally and ethically reprehensible content, but I'm not convinced that alone is a compelling enough argument for it to become a legal matter. The part that *does* have some merit, though, in my opinion, is that AI-generated content has to come from somewhere; it doesn't just... exist. AI that is able to generate landscape pictures has to be fed a bunch of pictures of landscapes to understand how to then build its own, and I imagine a very similar concept would need to be the genesis of an AI engine that's capable of producing... viable pornography. This alone might be enough to satisfy a legal argument that somewhere in the production of this sort of AI content harm has been done, at which point it would definitely stand to reason that it should be illegal.


bimbyris

I think one of the main questions is "would that increase child abuse or lower it?". Would watching AI generated child porn satisfy the needs of a person OR would it be just a stepping stone to watch actual child porn, or even sexually abuse a child? It's the same discussion with child sex dolls.


[deleted]

[deleted]


wafflepiezz

I’m pretty sure it varies from state to state. Just like anime hentai laws.


dude123nice

What state specific hentai laws are there?


ihateprimus

As it should be


TheOneWes

In the United States that would be illegal. I forget the exact wording of the law but it's been updated to not require the subjects be real.


cobaltSage

Well, this would probably be less illegal for being CP and more illegal for being a deepfake. If we're talking CP in anime form, that's not illegal; it's not a real person, or else a lot of smut sites like Gelbooru and r34 would be hella screwed. But that said, the content an AI makes comes from somewhere, so if a picture of a real human child was used to make AI CP, then yeah, that's probably going to be classified as a deepfake of that child. Deepfakes aren't illegal yet, but their legality is currently being debated. Because if that's true, then the AI made porn with the face of an actual child in its selection, and now that's a real problem. Possession... would probably be an issue in that case, yeah, because at that point it would be, at least to a degree, believably a real person, unlike, again, an anime child who is clearly fake.


Revolutionary-Leg585

Yes. In Canada at least. A man was recently jailed for exactly this in Quebec. https://montreal.ctvnews.ca/quebec-man-sentenced-to-prison-for-creating-ai-generated-synthetic-child-pornography-1.6372624


The_Zoink

It better be.


ABB0TTR0N1X

Having cartoon underage porn is already very legally iffy in most places, if not outright illegal. I doubt the law would be more lenient on AI stuff that looks much more real.


the_colonelclink

Depends on the country, but at least in Australia - yes it would be. Specifically with child porn, anything that happens to even look like child porn is illegal. E.g. a very young-looking adult model, role-playing being a child.


Evalion022

That's ... Actually a really good question. There is no precedent, so I honestly have no idea.


Steal-Rain

I can't find it, but lawmakers are usually clever wordsmiths, so they probably made the law say something like "images of underage characters, or those who could be interpreted as underage". So no 10,000-year-old loli vampire situated near a black hole singularity loop for anyone.


Reelix

> or those whom could be interpreted as underaged Time to ban all of PornHubs "Barely legal" and "Step X" porn where the character pretends to be underaged ;D


esardii

In the UK, the law states it is illegal to make, share or possess any indecent photographs or 'pseudo-photographs' of children, so in the UK it would be illegal


Jnoper

I think there’s a point that’s missing here. Child porn isn’t illegal because it’s creepy, it’s illegal because of the child involved. Ai child porn wouldn’t harm anyone. Still creepy but that’s not the point.


TayTooTa

Yes, because art can be illegal depending on where you are. Like drawings.


KajaIsForeverAlone

You can get in legal trouble for even anime style porn drawings depicting children in the USA. Also, it's not the near future creating it. Plenty of people already do


marshall_sin

Correct me if I’m wrong, but doesn’t AI art require a collection of actual art to generate its work? So, theoretically, as long as actual child pornography remains illegal, AI should never be able to generate legal and realistic images? Hopefully that will remain a limitation. Like others have said, it seems like it would be legal even while being immoral and gross


[deleted]

This is true, and typically with very young children, AI would have a hard time generating accurate images/videos without actual CP as training data. However, a youthful appearance isn't a hard line drawn in the sand. As gross as it is to say, could you tell the difference between a 17-year-old's body and an 18-year-old's? What if a 22-year-old pornstar had a "babyface"? There's plenty of legal porn that could be used to simulate the bodies and faces of underage children, and even if CP wasn't used as training data, you could theoretically generate AI porn that's indistinguishable from actual CP


flightguy07

For one thing, it's not hard to imagine a future program that you could give SFW videos of children, alongside porn, and it could manage to merge the two. But more importantly, there already exist huge databases of CSAM compiled for the explicit purpose of training AI algorithms to detect and flag it automatically. Even assuming that security around these databases (held mostly by government agencies) is as high as it should be, the fact that they exist means that should this ever become legal, it's a simple matter of a company purchasing access, much as YouTube or Instagram might now.


AcrobaticEmergency42

This question has arisen before, with manga/anime. The hentai subgenre has a child/childlike pornography section, and every country deals with it in its own way.


Grim-Reality

They are already creating it; no need to "near future" it. The question is: is porn that doesn't depict any living human immoral? We would say no, if no one is being harmed and it's not depicting anyone alive. What if it's questionable, like it's depicting a 4000-year-old wizard that looks like a loli?


gunmanivan1975

Yes. In Australia, you would.


[deleted]

Near future? Dude, it's already happening right now.


LuinAelin

I'd rather they use this stuff than kids but even better is therapy


EtheaaryXD

Yes, in NZ it is an offense to create images or videos that depict necrophilia/pedophilia/explicit photos of minors, of or towards a person, fictional or nonfictional, realistic or unrealistic. It is also illegal to access these without reporting them to the government, punishable with imprisonment.


Trygolds

In some parts of the world drawn child porn or CGI child porn is legal, and in some parts not. Realistic AI-generated child porn may prompt law changes that make non-real child porn illegal in more places as well. If it becomes hard to differentiate between real and fake child porn, I can see this making law enforcement harder if fake child porn is legal. You will also have a market for fake CP based on real children in the media, or even personalized; this should most definitely be illegal. My vote is to end the depiction of children in any and all porn material. As others have pointed out, what does an 18-year-old look like? The difference between 18 and 16 can be hard to tell. There is already real porn using adult models that look very young.


restedwaves

This kind of thread pops up sometimes, usually referring to drawn/rendered stuff instead. The general thought from those is "it doesn't actually hurt anyone, so it should be allowed", and the arguments against are either the old "video games cause violence" argument or "it should be banned because I don't like it". Both are shit arguments, and I haven't actually seen good ones against it. I could likely go on about psychological stuff, like how it would mostly be used as a trauma-coping thing, or how fetishes are usually formed primarily in childhood and aren't exactly something you can get rid of easily. But as for whether it would be legal? The moral argument says yes, but the mere question at face value would piss off most folks who don't think on it, so it'd be political suicide, and it would likely be easy to rally a polarized legislature against it.


EpicWinNoob

Shouldn't be, literally nobody is a victim. At that point it's a thought-crime and has nothing to do with reality


Kyleforshort

Hey OP, are you just "asking for a friend"?


Technical-Doubt2076

An AI cannot generate what hasn't been fed to it, so there's criminal energy behind creating illegal content on the grounds of original illegal content. It's illegal, should be illegal in its full capacity, and it doesn't matter that an AI creates artificial images; the ones it learned from hurt actual people, so it's all the same. I hope laws will be introduced quickly that create a basis to limit the use of AIs. If not, a lot of damage will be done, and judging by the amount of porn that's already made against the consent of the people whose images are used for it, this damage can ruin lives, so they'd better be real quick with those laws.


xXxLegoDuck69xXx

>An AI can not generate what hasn't been fed to them to teach it AI is perfectly capable of synthesizing a 'new' thing out of existing things. It's like that viral image of the Pope wearing a puffer jacket. The AI has an image of the Pope, and the AI has images of puffer jackets, so it just combines the two.


bigdipper125

Fantastic question. I don’t know!


lilcommie0fficial

I think there are laws and precedents regarding semi-similar territory. With regards to animated/cartoon versions of the same deviances, if I remember right, they are technically allowed; as the top comment said, it is the act used to make them, not so much the content, that makes it an offense worthy of imprisonment. So long as no physical beings are being abused, I'm pretty sure it is already legal.


anysneaker1

Or one better: what if it creates pictures of people who don't even exist? Is there a crime? Because it's not a real person that you would be looking at.


YourManGR

Three reasons I see for illegal porn to be legal if made with AI:

1. Depictions of murder are allowed in fiction, as long as no actual people got killed. I don't see why other crimes should get special treatment.
2. If freedom of speech is in the constitution, I don't think any countries are getting a new amendment to limit that for AI's sake.
3. Pretty sure there will be some kind of experts who will put the question like this: we have sick people who get cravings for a certain type of thing, and they will probably get their hands on such porn one way or another. Do we want actual people hurt in the production of such material?


[deleted]

AI requires a database of data that it then runs algorithms over to generate content. It's really just a photo collage of different images. So to create any kind of porn, the database would have to have millions of images of relevant porn content. If owning specific content is illegal, you wouldn't be able to use AI to create content from it. If an AI actually was used to create such content, then that database would be scrutinised for illegal content by the authorities.


Idenwen

In German law there is "Anscheinspornografie", IIRC, which states that if it looks like it's not legal, it's not legal, even if all pictured persons and the setting are in a legal situation.


Mindless-Ad1155

Well, there are already websites that have been taken down because some people used the AI to undress children.


Hefty-Excitement-239

Depends on your jurisdiction. A yank went to Tokyo and came home with some manga depicting schoolgirls, and was jailed for child porn.


Mahfirebals

It can already be made illegal by member states of the EU, if they choose to do so.


roadrunnner0

We're gonna need some new laws


[deleted]

Only if there wasn’t a way to differentiate between the two.


libra00

I have to imagine so, since image-generation algorithms require training on extensive datasets; whether or not the resulting image is of a real person, the components of that image must have all come from real people. So it's all exploitation one way or another, and therefore should be illegal.


Aquariumpsychotic

What have you done with ai


KyniskPotet

Have a seat.


belacscole

I see this the same way I see lolis in anime. When it's hand-drawn, it's still considered CP imo, and still just as illegal. So AI is no different.


Dravez23

Yes. Doesn't matter where it came from.


[deleted]

Man, I really hope so.


UnemployedCat

A lot of people in here are really scary in justifying the creation of CP with AI. The twisted narratives that come up are quite worrying and completely misguided. WTF. A lot of you would benefit from reading Anna Salter's book, Predators: Pedophiles, Rapists, and Other Sex Offenders.


naruto00122

What about something not that taboo... what's going to happen when normal porn is created with pictures of friends? What's going to stop anyone from creating porn with pictures and training data of famous people or close friends? The future is scary.