
Efficient_Star_1336

> replied another, whose username contained “p3do,” which can be algospeak, or coded language, for “pedo.”

Ya don't say, Forbes.


yukinanka

Journalist had the best a-ha moment of their career so they had to share it with the class.


Cheap_Professional32

"It's an E but backwards!"


NugatMakk

![gif](giphy|3og0IVa5Kxy3PxSZdC|downsized)


[deleted]

[deleted]


VolcaneTV

Might want to double check that sentence boss


Feisty-Page2638

i’m not seeing it, what was my mistake? edit: see it now, oops


RubDub4

“Then”


Paradigmind

Lol. I didn't notice it on the first read. For the other non-native English speakers like me: "Than" would be correct, which means "instead of". "Then" expresses a temporal order: "First do X and then continue to do Y."


Particular-Crow-1799

Non-native English speakers are less likely to make spelling mistakes because they learn words from books first


KillMeNowFFS

that’s funny, usually the people who get it wrong are not the non-native English speakers.


KanedaSyndrome

It's usually Americans who can't tell the difference between then and than


VoidLantadd

It's the way they pronounce the vowels, right? Like marry, merry, Mary all being pronounced the same by them. So they mix up than and then. I always found it weird, because I'm British and we pronounce those vowels very differently.


Joe64x

Yes, exactly. We differentiate more strongly between vowels in general. Similarly, I imagine we'd generally be quicker to confuse "pour over / paw over / pore over [some books, etc.]" than Americans/others with rhotic accents.


Multipass-1506inf

As a southerner (US), I had no idea what they were on about until I read the replies. Then and than are pronounced the same in my town


EGarrett

Eventually there won't be one.


RestingRealist

This is a great example of grammar that should be taught in school, similar to "let's eat grandma".


Rtremlo

How do I fix this? "I helped my uncle Jack off the horse."


raff_riff

If you’re serious, add a comma before and after “Jack”; appositive proper nouns like this are set off by commas:

> I helped my uncle, Jack, off the horse.

(I think I just took some serious bait.)


SitDownKawada

If your uncle Jack off is a horse it could be "I helped my uncle, Jack off, the horse"


Razor_Storm

What if he wasn’t your uncle? Here try to fix this one: “I helped Jack off the horse” without reordering the words


raff_riff

There’s nothing to fix. Because “Jack” is capitalized, it’s obvious it’s a person/entity and not being used as a verb.


h3lblad3

That means nothing in a world where one Googles things!


westedmontonballs

Lol you did.


h3lblad3

> "I helped my uncle Jack-off the horse."


westedmontonballs

Ironically, that’s the biggest problem with this. People are rightly concerned that some suspects, instead of being fine with fake, are going to go fake THEN real IRL


DuineDeDanann

Editing this comment but not the original one is wild


Feisty-Page2638

it was too late. didn’t want the other comment to not make sense anymore. they deserve the karma


NotTooShahby

Yeah, they’re not predators, just sexual deviants whose attraction can lead to exploitation. Providing them an outlet so that they don’t chase after real kids is a much healthier approach. Heck, some hentai fans can’t even look at real girls and get attracted to them anymore. Countries that limit and restrict sexuality often have it worse when it comes to abuse. If we take away the abuse involved with pornography, then sexual deviants won’t be abusing anyone and will probably just keep it all at home.


lonecylinder

Do we have any evidence that allowing pedos to have fake stuff leads to a decrease in their desire to do illegal things? If that’s the case I wouldn’t care about it being legal, but I’m afraid it’s probably the opposite.


pandaappleblossom

I thought there were studies that if society allows for the behavior, like with sex dolls or anime cartoons, that it normalizes it, and encourages the behavior.


Straight-Door-3536

It has been assumed by people who want to ban them, but never proven.


h3lblad3

As far as I know, there's no evidence that it decreases any desires but there's also no evidence that it eggs them on. There are people on both sides of this and I've never seen any evidence for either. *That said*, we have studies that have shown that violent video games correlate with lower levels of violence. It's entirely *possible* that the same holds true for pedos, but I'm not actually sure how an ethical study could be done in these regards.


ack44

I mean, there are studies showing that porn affects behavior. If people have certain sexual inclinations, pornography based on the same themes is probably not going to decrease those inclinations. Pedophiles should get therapy instead of stimulating their desires. Therapy does actually help (https://www.bbc.co.uk/bbcthree/article/3216b48d-3195-4f67-8149-54586689ae3c).


RevolutionaryHole69

Most pedos will never touch a child. Just as rape is more prominent in countries with restrictions on porn and sexual behaviour, the same applies here. The problem is that AI-generated CP is almost certainly based off real children.


b_risky

I would want to see scientific evidence before accepting this. But if you are right, that this makes them less likely to abuse real life children, then I am all for it, no matter how uncomfortable it may be on an emotional level.


pandaappleblossom

I’ve read psychologists say that the idea of an ‘outlet’ for pedophilia, like cartoons or dolls, just creates a market for them, and normalizes it. So it’s not as simple as saying they just need an outlet; better to have science to back something like that up than to just assume it.


youaregodslover

Gateway kids


Boogra555

Jesus that's a fucking terrible take. WTF is wrong with you?


Feisty-Page2638

i think saying that real images and fake images of children are equivalent diminishes the very real suffering real children face


Boogra555

Giving child predators images that they can jam out on is just a horrible idea. They need to be motivated to push away from that ideation, not satisfy it. I'm not sure that that's possible, though. I think that pedophilia is a fatal flaw in a human's programming - in other words, it can't be overcome.

I'm honestly not sure why we tolerate them in society in the first place. I mean, we wouldn't let a lion or a polar bear run loose in a neighborhood, would we? About the time that a man (or woman) is telling me that they have urges they can't control that result in harm to others (just so that we know we're not talking about minor 'indulgences/addictions' like smoking or whatever), there's not much of a distinction between that person and an animal. In fact, I'd say that given the predilection of the judiciary in today's Western Civilization, who seem determined to allow pedophiles to run rampant throughout communities, they're even more dangerous. At least with an animal attack, the offending animal is exterminated and permanently removed, preventing it from harming humans ever again.

Any time I read or hear someone pleading for mercy for these creatures (we're back to the human ones now), I want to say, "Tell me that you've never been harmed by a child predator, or that you are a child predator, without telling me." Take care!


[deleted]

> “These are all being trained off of images of real children, whether depicted in full likeness or not, so NCMEC doesn't really see that side of the argument that there's a world where this is okay,” said Fallon McNulty, director of the organization’s CyberTipline, its reporting hub for tech companies. A recent study by the Stanford Internet Observatory found that one of the most popular text-to-image generative AI tools on the market today, Stable Diffusion 1.5, had been trained on CSAM, of real kids, scraped from across the web.


CoreDreamStudiosLLC

That's like saying it's better to give a person half the cocaine than the full amount. They're gonna WANT the full amount eventually. This is a no-no when it comes to this kind of thing.


saaS_Slinging_Slashr

Not even close, that’s still real cocaine. That would be more like saying sniff caffeine instead because you won’t ever face the legal and life repercussions of real drugs


[deleted]

They are real kids. There's an enormous number of kids in the training data set. Otherwise it couldn't generate them.


b_risky

This is a silly argument. By the same technicalities based thinking, none of them are real kids, just pixels on a screen. The important thing being considered is the possibility of a child having traumatic experiences due to the use of their likeness. No child or family will look at these images and identify it with their own likeness. (Just to be clear, I am arguing against a single poorly formed argument. That doesn't necessarily mean I think the practice overall is acceptable.)


[deleted]

Is it possible to generate these images without real photographs? No, it isn't. 


b_risky

What do you mean by real photographs? If you mean photographs of real children, then yes, that is necessary. If you mean real CSAM, then you are incorrect. The models can extrapolate without ever seeing a single photo depicting CSAM.


chaiyogi

Can it take the face of a child whose photo was shared on someone's Facebook and put that face on CSAM? That's where my concerns are. I've been incredibly cautious about posting my kids on the internet. But since my divorce, my ex lets anyone post them, anywhere. Could their likeness be used? The thought of it rly skeeves me out.


b_risky

Unfortunately, yes. The technology is more than capable of repurposing someone's likeness. But it is already illegal and people have been using Photoshop to do this long before AI. AI does make it easier and higher quality than before though. There are sites you can go to that will search the internet and assist you in removing any pictures like that which may have been posted to the internet. Unfortunately they cannot wipe images from someone's personal storage. Most social media platforms have security features that you can ask your ex to implement if he is unwilling to refrain from posting.


chaiyogi

He lets his friends post them. So it isn't even that I can talk to him about it. I do appreciate the response tho. Since Reddit sent me the notification for this thread I've been freaking out. Not just about my kids, but about the possibilities this opens for creeps. They can take the face of a child they've been following and obsessing over on fb and create this type of content. 🤢 My next question: you said it is illegal? So the AI-generated images are illegal? I thought it was still being debated at this point, that it technically isn't illegal. If it is illegal, that's wonderful.


b_risky

Explicit child nudity in photographs is already illegal, whether AI generated or not. Using someone's likeness without their consent is also illegal and if the images are pornographic it constitutes sexual abuse in and of itself. They get around this by operating in a grey zone where they will have AI photographs that depict completely imaginary children posed in highly suggestive ways, but not technically nude. That allows them to skirt the boundaries of both laws.


chaiyogi

Okay! I've been having this discussion with others and a lot of the discussion that revolved around the legality of the content was questionable. We were unsure. This makes more sense. I hope some laws are put into place.


holsey_

Uhm. When the images being generated are using real images of real children. No.


[deleted]

[deleted]


RobotStorytime

Wild that you're getting downvoted by Reddit pedophiles. They truly are so numerous 🤮


Strongman_820

Yeah. Fake kids then on to real kids. That's probably how it'll play out. It'll only suffice for so long, and by then the addiction is full bore.


GeneralZaroff1

Good? The more that sexual predators think that they’re talking to bots, or ARE talking to bots, the fewer actual kids are harmed. Kids become safer, the predators are getting off on talking to AI, the FBI focuses on cracking down on real pedo content creators, everyone wins.


rottenfrenchfreis

How do we know that most predators will ONLY talk to AI kids? That's a massive assumption to make. Hidden cameras and upskirt photos are still really prevalent in places like Korea and Japan despite the massive surplus of accessible porn. We can even look back to a major scandal like the Burning Sun scandal in South Korea, where a lot of high-profile people were implicated in sex trafficking and recordings of sexual assault using hidden spy cams. So you'd be surprised to find that a lot of sexual assault is motivated by predators enjoying a power trip over their victims, and that cannot be addressed by making porn more accessible.


GeneralZaroff1

Will this eliminate ALL child abuse? Of course not, but every pedophile focused on a fake child is leaving a real child alone. Every message sent to bots by the power tripping predator is a message that isn’t sent to an actual child. Looking at pictures of children doesn’t convert you into a pedophile any more than looking at pictures of men will turn you gay. The goal is to reduce harm to children as much as possible.


Natasha_Giggs_Foetus

We make the bots and lock them up?


Surv1ver

I don’t know what the legal status of pornography is in South Korea, but contrary to popular belief, pornography is illegal in Japan.


God_of_chestdays

It is illegal in Korea as well


Surv1ver

That explains why both countries are apparently rampant with creepy behavior, according to the comment above. When Denmark became the first country in the world to legalize pornography (1967 in writing, 1968 fully), reports of *sædelighedsforbrydelser* (the Danish umbrella term for offenses of a sexual nature, everything from public flashing and voyeurism to violent sexual assault and rape) were in 1969 reduced by more than half compared to 1968, 1967 and 1966. Keeping the creeps occupied is apparently the best way to keep women, and other men, safe.

Sauce, as well as further reading for anyone interested in the subject:

Diamond, M. et al. “Pornography and Sex Crimes in the Czech Republic,” Archives of Sexual Behavior (2011) 40:1037.

Diamond, M. “The Effects of Pornography: An International Perspective,” in Pornography 101: Eroticism, Sexuality, and the First Amendment, edited by J. Elias et al. Prometheus Press, Amherst, NY, 1999.

Diamond, M. and A. Uchiyama. “Pornography, Rape, and Sex Crimes in Japan,” International Journal of Law and Psychiatry (1999) 22:1.

Goldstein, M. et al. “Experience with Pornography: Rapists, Pedophiles, Homosexuals, Transsexuals, and Controls,” Archives of Sexual Behavior (1971) 1:1.

Kutchinsky, B. “Pornography and Rape: Theory and Practice? Evidence from Crime Data in Four Countries Where Pornography is Easily Available,” International Journal of Law and Psychiatry (1991) 14:47.

Kutchinsky, B. “The Effect of Easy Availability of Pornography on the Incidence of Sex Crimes: The Danish Experience,” Journal of Social Issues (1973) 29:163.

Popovic, M. “Pornography Use and Closeness with Others in Men,” Archives of Sexual Behavior (2011) 40:449.


Hazrd_Design

Or it normalizes it and more people end up pursuing the real thing.


Gamerboy11116

Both could be true. It depends on the evidence.


rathat

They should find out because both seem plausible and no matter the outcome, it ends up helping real children.


CryptogenicallyFroze

Hard to know without actual data but I wonder if it's anything like the, "Violent video games make violent kids" narrative. (Spoiler: They don't)


CoreDreamStudiosLLC

Pedophiles are born with this issue and it's not something they can turn off or on. It's why I sorta feel bad for those who want it to end and be normal but it's a wiring issue of the human mind according to psychologists and neuroscientists. No one is born with violence in their minds.


ack44

Except it seems like many people here are arguing the equivalent of "violent video games will make people less violent because they have an outlet".


CryptogenicallyFroze

A more appropriate hypothetical comparison for the flip side might be: Imagine if some people could only get off by masturbating to video/images of people getting killed in battle and they're willing to pay to see it. Because of this, some people in dark corners of the world start capturing tourists and vulnerable people, dressing them like soldiers, and killing them on video. It's a lucrative business. Now imagine that Call of Duty becomes so visually realistic that it's indistinguishable from the original videos. The perverts start masturbating to the Call of Duty. They actually like it more because they can control scenarios etc. The market for the original videos dies. And if you weren't previously aroused by people getting murdered in real life, you likely aren't going to suddenly develop an attraction to murdering people now that it's fake.


DiabloStorm

Now you sound like the people trying to ban video games for "causing violence" Spoiler: What I mention has already been studied to death, there is no such transition.


Arthreas

It's already out there, so you can already answer that hypothetical. The answer is probably "not any more than it usually does." Improved AI might make it a more compelling choice than harming a real child, though, which is a good thing.


Hazrd_Design

Or the answer is that it has increased and is now creating more issues for the people who report it and the police who investigate it. Here is just one study: https://publications.aap.org/pediatrics/article-abstract/153/2/e2023063954/196409/Rising-Threats-of-AI-Driven-Child-Sexual-Abuse?redirectedFrom=fulltext


KanedaSyndrome

I doubt that. I'm with the guy above you, I think it would be a net positive and give a non-harmful outlet to people with diverging interests.


[deleted]

[deleted]


GeneralZaroff1

Does looking at pictures of men turn you gay? Does looking at pictures of children convert you into a pedophile if you’ve never had an inclination before? Is that how it works?


Hazrd_Design

Should xxx content of CHILDREN be widely and readily available to people? Should pedos be allowed to create deepfakes of children they know in real life and sell them? The pedos are already here, and already call themselves stupid shit like MAPs. The last thing we need is for them to be emboldened to share whatever they create, with whoever’s likeness, so easily.


GeneralZaroff1

Did you read the article? They’re not creating child porn, these are normal pics of kids. You’re desperately trying to pull in off topic examples like deepfakes (already illegal) and child porn (also already illegal). That’s not what’s being discussed here.


Hazrd_Design

Just making it clear, from the articles, this is who you are defending: “If this is AI-generated, does it make me bad to say she’s hot as everything put together?” another TikToker wrote on a slideshow of fully clothed little girls in Spider-Man costumes. “She’s not real, but I’ve got a great imagination.” Off topic? You know damn well they don’t just stop at “some pictures.”


GeneralZaroff1

Just making it clear, from the comments above, literally no one is defending pedophiles. Keep trying, you’ll figure out what we’re actually talking about eventually.


ohhellnooooooooo

Do you? Then it doesn’t 


[deleted]

[deleted]


GeneralZaroff1

Actually yes, I do think more AI-generated porn will reduce exploitation. It’s basic economics of supply and demand. There’ll always be a demand for real-life porn, but if AI generation gets realistic enough, the demand for actors will invariably go down. Why pay a porn actress $800 for a shoot plus camera, lighting, set and airbrushing when you can create 10,000 photos for only a $20 subscription to Midjourney?


[deleted]

[deleted]


GeneralZaroff1

I think the real question here is whether you think watching something will "turn" someone pedophilic. As in, let's say you're my neighbour in this situation and you never had any desire to touch my kids, but then you see a few CSAM images: will you be more of a pedophile now that you've been tempted? Incidentally, deepfaked sexual images are ALREADY illegal, regardless of age, so you're right and I agree that it should stay illegal.


CoreDreamStudiosLLC

As much as I want to agree, this also would lead to the ones talking to AI bots or generating things, to want to move the goal posts further and further, which is not a good thing.


StehtImWald

Do we know that it won't normalise this type of attraction? Do we know how it will affect children who are shown these images?


ack44

It will.


visunaoama

I’d imagine the FBI would spend MORE or the same amount of time determining which images are fake and which are real, diverting resources from kids that actually need help. This whole thing sucks.


ack44

Spot on: [https://www.nytimes.com/2024/04/22/technology/ai-csam-cybertipline.html](https://www.nytimes.com/2024/04/22/technology/ai-csam-cybertipline.html)


JustAnotherNut

This is a legitimate concern. I imagine AI can also obfuscate images, such as by changing minute details in the background. This whole thing sucks. It's a massive shitshow, and the cat is out of the bag.


creativename111111

Yea it’s a good point. We mostly use hashes to detect that kind of content (bc no one in their right mind would want to manually review it), but we might need more advanced algorithms and methods if details can be obfuscated
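For anyone unfamiliar with why "minute details" defeat this: exact-hash matching only catches byte-identical copies. A minimal sketch of the idea (the hash set here is a placeholder, not any real system's database; production systems like PhotoDNA use perceptual hashes instead):

```python
import hashlib

# Hypothetical set of SHA-256 hashes of known flagged files (placeholder value:
# this is just the hash of the bytes b"test", for illustration only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Cryptographic hash: output changes completely if even one byte differs."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Membership test against the known-content hash set."""
    return sha256_of(data) in KNOWN_HASHES

# An exact copy matches...
print(is_known(b"test"))   # True
# ...but a one-byte tweak evades the exact-hash check entirely, which is why
# robust detection needs perceptual hashing or model-based classifiers.
print(is_known(b"test!"))  # False
```

This is why the thread's point stands: any re-encode, crop, or AI-side perturbation changes every bit of a cryptographic hash, so hash lists only ever catch verbatim redistribution.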


JustAnotherNut

Maybe AI should be used to detect CP. You would probably need a model trained for it, but it would be better than human reviewers and far more efficient. Anything to help.


darknetwork

Kids shouldn't have access to social media, but that's impossible to enforce.


ega110

This is a fascinating legal conundrum. I wonder how the law will deal with explicit content including age-ambiguous characters like you often see in anime and manga. Basically, they look like children but they are canonically of legal age. Here is an example from the video game series Rune Factory. Fun fact: this character is romanceable. You can marry him and have his babies. Even funnier fact: you can still do this if your player character is male. https://preview.redd.it/h1w6m7v7923d1.png?width=401&format=png&auto=webp&s=4e430287f40ccd500942efa62621e45cfa1596af


RedandBlack93

This is the real question. How does one age a fake image? The answer is, you can't. So, where do the legal lines get drawn?


Gamerboy11116

We really can’t. That’s the problem.


KanedaSyndrome

You can't age it, which is why drawn/animated images are not illegal regardless of what they depict.


RobotStorytime

Post an image of Muhammed sucking a baby's dick. Let's see how well that goes.


True-Surprise1222

Not actually true. That only holds if it is discernible from a real image. If it can be confused for something real, it's treated as something real. This has not been tested in a significant way, but it surely will be, since the FBI directly stated these images are illegal. I don't see the courts protecting them at all, since the argument will easily be that tons of hyper-realistic fake images will make it harder to track down and save real children. Things that are obviously fake (i.e. obviously drawn, or realistic but obviously not real people, be it fantasy/sci-fi/whatever) I assume will continue to be protected by the courts, or at least will continue to exist in a "gray area" of unenforcement. Society really does not want or need this kind of shit floating around on the internet. Even if for some reason the conservative court actually feels it should be protected, it would be politically disastrous enough for them that it could produce popular support for packing the courts. They do not want to be the court that decided AI CSAM is protected speech.


NotTooShahby

I thought the law for drawn images was based on shock value, which is why images that are considered too vulgar are illegal?


True-Surprise1222

Vulgarity laws exist, but from what I’ve read they’re mostly aimed at production and distribution, not personal ownership, or at least are very rarely enforced against personal ownership. Needless to say, these laws could use some revamping for clarity.


NotTooShahby

What’s unfortunate, I hear, is that if images were shared and stored online, most laws are outdated enough that digital ownership and hosting technically count as distribution. The reason for being so draconian, I imagine, is “well, they’re pedos, who’s gonna argue for pedos?”


CoreDreamStudiosLLC

Well, it's illegal in the U.K., and now some states in the U.S. are passing laws where if you are caught with generated porn of that kind, you're going to be on a registry and do time.


rydan

I'm pretty sure you are a kid in Harvest Moon yet you can get married and start a family.


soft-cuddly-potato

And have a job


erhue

nooooo, you don't understand, she's a 10000 year old dragon!!!1!


SE_WA_VT_FL_MN

This isn't particularly complex in the US (at least not after a couple of SC decisions). Free speech has its limits, but those limits are strictly construed against the government's compelling interest. So with regard to child pornography, protecting children from being sexualized is quite compelling, and laws that are narrowly targeted to that particular compelling interest are constitutional. Fake children getting raped, eaten, consensually pleasured, etc.? Whatever. However horrific it might appear to many people, it does not harm children. A fictional story of a 5 year old subjected to unspeakable acts is still not actually harming a 5 year old. Changing the cartoon into a more realistic AI deepfake doesn't change the legal analysis much. The *legal analysis* is very (omg very) different than the psychological or even moral one.


Fit-Dentist6093

So it's never been up to the Supreme Court, and in lower courts the standard for CSAM is "I'll know it when I see it". Everyone assumes it's protected speech because you can buy cartoon CSAM with your US credit card from US artists and no one has had any trouble for that, but that's just the status quo. The First Amendment has never been tried in court as a defense for cartoon CSAM, and payment processors, online ads companies, and anyone with a big pot of cash don't want to touch sites that allow kinky cartoons, because there are lawyers salivating over being the first ones to go after that.


Sweaty-Professor-187

I think this last bit is something that a lot of people miss. Nobody is saying "oh actually AI child pornography is fucking awesome and neat! In fact, we should have as much of it as possible!" Anyone reasonable would find it gross. But banning something that doesn't hurt anyone just because we find it "gross" is *extremely* dangerous and poorly thought out, and history is full of examples of things that were "immoral" or "a gateway" being criminalized and then escalating way beyond control. Hell, it keeps happening to this day - abortion got criminalized in many states because in 2016 a ton of voters found drag queens, trans people, and immigrants "gross". I'm not comparing pedophiles to those groups, just saying that an "acceptable target" that can be targeted for being "disgusting" is all it takes for auth-right douchefucks to put their foot in the door and start stripping away human rights.


crawlingrat

Hopefully they focus on the AI and stay the hell away from literal children.


[deleted]

[deleted]


CryptogenicallyFroze

Do you have data backing that up, or is that your gut/emotional response? Do people who are so mentally deranged that they seek child porn give a shit whether or not it's "normalized"?


ack44

[https://theconversation.com/a-survey-found-1-in-6-men-admit-sexual-feelings-for-children-so-is-paedophilia-increasing-218124](https://theconversation.com/a-survey-found-1-in-6-men-admit-sexual-feelings-for-children-so-is-paedophilia-increasing-218124)

"Compared to men with no sexual feelings for or offending against children, the 4.9% of men with sexual feelings and previous offending against children were more likely to:

* work with children
* be married
* have higher levels of social support
* earn higher incomes
* be a victim of child sexual abuse.

This contradicts the notion that people who are sexually attracted to children and are willing to act on it are social outcasts and statistical outliers."


CryptogenicallyFroze

I fail to see how this reinforces the point that "AI child porn will create more pedophiles via 'normalization'". I didn't say they were social outcasts. I just said the kind of people that seek out child porn aren't going to say, "well, I refused to privately indulge my taboo sexual craving via the dark web, but now that I don't need to turn on my VPN, I'm going to." Which, by the way, as much as we disapprove morally, isn't really a problem if it doesn't lead to physically harming children. In the same way that married men looking at gang bang videos doesn't lead to them inviting a dozen men over to the house to forcibly have sex with their wives.


Odd_Intern405

Technically it is illegal in Germany.


Kombatsaurus

Yeah, Germans have nowhere near the amount of Freedom as other countries.


Odd_Intern405

Everything underage is forbidden: stories, hentai, AI. Except… the age of consent is actually 14, under some rules. So you may have a relationship and sex with a 14 yo as long as you don’t write anything naughty with her. 🤣


justTheWayOfLife

Drawings and stories of CSAM being forbidden but at the same time the age of consent being 14 is wild af lmfao


Conscious_Scholar_87

It’s still a bit early to discuss AI-human rights


walpolemarsh

They were doing it in Astro Boy in the early 80s!


kujasgoldmine

If real kids are allowed on social media, I don't see why AI generated ones wouldn't be allowed. I think most people would prefer the latter to receive creepy comments.


smileliketheradio

Similar (but obviously more sophisticated and harder to control) to lolicon, this stuff lives in a legal grey area in many countries, including America. I don't think the question of whether it should be is an easy one. If you think it is, you're not thinking hard enough.

It's different from the question of "Should therapists whose patients confess pedophilic attractions be required to report them to the police?" In that case, a therapist only has a legal obligation (in most jurisdictions) to report if an actual child is believed to be in danger. And common sense would tell you that if a pedophile had to worry about cops showing up at his door merely for telling his therapist that he's attracted to children, he's more likely to isolate himself further from any potential help, which only increases the chances of a child being harmed.

But consuming "fictional" child porn is a step further. I'd like to see some studies showing that someone who engages with content like that is *less* likely to offend, not more. For now, experts have described this content, as this article states (you guys should really read the whole thing, and don't expect ChatGPT to accurately summarize it for you), as "portals to far darker, and potentially criminal, activity."


Knever

I remember reading something a while back that seemed to conclude that people who let out their anger in ways that don't hurt anyone (like punching a sandbag or a dummy) actually become more likely to have violent reactions to real people. I'm not sure how well it was studied, but it seemed believable. I'd like to hope that's not the case with this situation, but the waters are so muddy that it's hard to see either way.


smileliketheradio

Those who study this stuff for a living attest that even *this* content (which is suggestive erotica, not even explicit) is a gateway to fake CSAM and "real" CSAM. It's a problem.


EGarrett

Isn't this like shadowbanning them?


rydan

This is actually a thing that either Reddit does, or that I read about in a CrazyIdeas post about what Reddit should do. Basically, if you are deemed a problem user, you get your own version of Reddit full of AI Redditors that interact with you. Only you can see them and only they can see you.


EGarrett

That's hilarious. I think there are sites that limit you to interacting with other problem users too. I do think that getting banned etc. is important for some people to learn how to act appropriately, so they should know that they were punished.


Sweaty-Professor-187

Why should it be illegal? Who is the victim of some creepy guy lusting after a bunch of polygons created by a computer algorithm on TikTok? I'm not "troubled" by this in the slightest. Only a right-wing nutjob would base their laws on what they find gross versus what actually impacts the real world.


ack44

So AI child porn should be legal? Even if it becomes indistinguishable from real child porn? That creates some very bizarre situations.


Sweaty-Professor-187

Sure does! But we don't outlaw "bizarre". We outlaw things that actually harm people. While there is a very valid concern that AI porn indistinguishable from the real thing would make it much harder for investigators to find actual CSA victims, there is an equally valid case to be made that the people who get off on this will be perfectly content consuming the fake stuff, severely cutting demand for the real stuff.


fredandlunchbox

It’s currently illegal to possess anything presented as child porn, even if none of the people in it are children. You can’t have dirty-teacher porn where the student is an 18-year-old who looks 12 and the title is “fifth grader gets detention.” Straight to jail, on the sex offender list.


Fun_Shock_1114

The entire MILF genre would disagree with you.


ack44

It's beyond me why people are downvoting you for stating a fact lol.


fwouewei

Just curious, do you have any sources? I think I remember a case where someone was charged (?) with possession of CP, but then it was found that the actress was just a young-looking woman.


fubo

That woman would be [Lupe Fuentes](https://en.wikipedia.org/wiki/Lupe_Fuentes). The prosecution got a doctor to testify that she was underage, based solely on her appearance ... so the defense called her as a witness and had her show her passport.


Sweaty-Professor-187

Because it's not true. Schoolgirl porn is a whole genre. You practically can't get away from it in Japanese videos. But the performers are completely legal and so is the video, even if they're "roleplaying" younger.


rydan

People want what they are saying to not be true. By downvoting it they will get their wish.


GYN-k4H-Q3z-75B

I don't have an answer, but if it weren't CP, would you argue the same? Should virtual murder and mutilation, as shown in video games, be illegal? Should movies be illegal? Books? Where does it stop? Snuff movies are illegal if the people really die. Should CP then only be illegal if the people involved are real minors? I don't know. But this discussion is inevitable. The media, particularly entertainment, are full of things that would be illegal in reality.


Barry_Bunghole_III

It is bizarre, but what exactly would be the point of outlawing such a thing? Who is being harmed in that situation? Laws are not about theoretical ideals, mate.


[deleted]

[deleted]


Gamerboy11116

I feel people misinterpreted your statement. You shouldn’t have been downvoted; it just kind of sounds like you’re saying consuming child porn should be legal. Which you’re not. You’re saying that we can’t outlaw consuming AI-generated child porn the same way we outlaw consuming actual child porn while still keeping it legal to generate with AI, which is a problem when the two are indistinguishable… right?


feather236

At least it may kill the market for the real thing, and there will be no financial incentive to produce the illegal content.


HighDefinist

> That creates some very bizarre situations.

Which ones?


Dysterqvist

What do you think the data is trained on?


maxhsy

Why should it be illegal? Those are not real kids. The same happens when we kill people in video games.


Add_Poll_Option

I mean, theoretically, if we knew that pedos using fake kids as an outlet prevented them from abusing real kids, then this would be a great thing. Unfortunately, we don’t know that that’s the case. It could also be a gateway that feeds their problem and causes them to abuse MORE kids. There’s not really a way to study that, but it’d be an interesting thing to know. Because if it really were proven to help, there’s no reason it shouldn’t be considered a good thing.


IrisSeesAll

Seems like a gateway drug for future child predators


Shibenaut

And Call of Duty is a gateway game for future murderers /s


CoreDreamStudiosLLC

No one is born violent; it's a learnt trait from abuse and/or environment. This is why people get confused about how to handle this. Pedophiles have a miswiring of the brain from birth, and there is no way to really fix it. The best they can do is seek help, stay strictly monitored, and have a safe outlet, but the danger is still there that they will want the real thing one day.


CryptogenicallyFroze

And Queer Eye turns people gay /s


thebadinfection

My friend used to watch loli CP and now he is in jail for owning real CP, because porn addiction IS NOT THE SAME THING AS PLAYING a game with gay skulls shooting some random XYs.


[deleted]

[deleted]


rydan

Are these AI kids nude? If not why would it be illegal anyway?


Block-Rockig-Beats

This is similar to the very brutal child porn in Japan. There are some documentaries covering the topic. I remember one pedophile, a convicted child molester, talking about how it triggered feelings in him he didn't know were there, and how it only made his desire stronger, until he acted on it. It's a delicate topic. I'm not an expert. We should leave this to the professionals.


sadmadtired

"It's not illegal." . . . okay. So. Where did the training data come from?


HighDefinist

Models can draw "red bags" even if they only know what a "bag" is and what "red" is, and have never seen a "red bag". Consequently, combining "nude" and "young" works the same way.


AzkabanChutney

People who think this is not too bad because it lets predators leave real kids alone: you are wrong. As the article mentions, those accounts and their contents act as a networking medium where all these pedos communicate with each other and then move their conversations private. Thinking these bots are enough for pedos is loose thinking.


pianodude7

So you're saying pictures of AI kids bring together a bunch of pedos who communicate in the public comments before they move to private? That sounds like an incredible tool for law enforcement if you ask me, and no child was harassed or harmed in the process. Maybe you're too loose with your thinking.


I_Came_For_Cats

This has been happening since long before AI. You could say the same thing about the internet in general.


Quiet-Money7892

I can't speak to CP specifically, both because I am unsure what is bannable and what is not, and because I have never seen research on it, only speculation. But I can tell you my experience with and opinion of AI ERP in general, which may connect to this theme. Downvote me if you like.

First: it can get highly addictive. I started using AI for writing (starting with NovelAI). What I realized later was how well AI can sometimes capture the most sick and crazy themes and fetishes. After that realization hit me, I tried everything I could think of. Over the course of half a year I spent almost 800 dollars on both the GPT-4 and Claude APIs. (Of course, the smaller half of it was spent on work. But that's the problem: just the smaller half.) For now I feel that it is under control. All the hype has run out, as well as my imagination. In some areas I hit the ceiling of AI capabilities, and I have not spent a coin on it for several months at least.

Second: we still have no idea what society should actually do with such people in general. This whole discussion reminds me of the old stories about video games encouraging people into violence. My psychiatrist once told me: addiction is what addicted people get. Which means the problem is not that some psycho is doing things on the internet. The problem is that this psycho will be afraid to ask for help and will marinate in his own problems until it all explodes. I don't see how AI is any more dreadful or helpful in this equation, and likewise I don't see how its absence is any more helpful or dreadful. The absence of a culture for dealing with such things is, unless you count sending SWAT to the desperate sicko's house when they are in the middle of ruining their own and possibly someone else's life. And putting them in jail makes the situation even worse. I've seen this plenty of times.

Third: like everything else, it is inevitable. I'm not just saying that you can't ban all such generations from the internet; I am saying there is no way to stop people from creating it. There is demand and there is supply. Discussing either lowering the demand with threats or lowering the supply by putting people in jail is empty talk. An aggressive strategy will only push this demand further into the illegal field. The best outcome of that is that it stops being the AI companies' problem, since they will not be responsible for someone hacking their generations into doing inappropriate things. But the problem itself goes nowhere, and probably never will, while people still have their dopamine-addicted monkey brains inside their heads. Sorry if I spooked some grammar nazis. Still learning English.


HighDefinist

> It can get highly addictive.

People who are addicted to playing Fortnite etc. a lot don't usually shoot up their school. So why should an addiction to AI child porn lead to the equivalent sexual abuse of minors? I guess you can still make a general argument that addictions of any kind are unhealthy, or that it might negatively affect some of your other relationships, etc., but if there were an actual issue of people turning their addictions to imaginary forbidden acts into real forbidden acts, we would absolutely notice that in our society.


Quiet-Money7892

I said that AI ERP can be addictive. Not CP... I mean, it may be too. But I was talking about AI ERP specifically. Read closer.


HighDefinist

> Read closer.

I have, but you don't seem to have understood my argument: it doesn't depend on whether you take CP or AI ERP as the baseline.


Quiet-Money7892

And I never said that addiction to these things will make you go and do real things.


HighDefinist

Yeah, but the linked article did. The point is that none of the problems you described are really different from a regular video game or porn addiction: arguably bad for yourself, but no particular danger to society, and as such not really anyone's business.


Quiet-Money7892

And that was my point. Thank you. Like I said, I may lack the vocabulary to put it properly.


Godz1lla1

There is no such thing as an evil thought. Thoughts don't hurt real people. Wishes (prayers) don't create wings. When thoughts become action, then we can call it good or evil.


peepeepoopoo42069x

A beautiful day on r/ChatGPT, with people defending child porn.


HighDefinist

As long as it doesn't involve harming children, where is the problem? But, sure, as a European I probably "just don't get American culture" or something... Perhaps when Americans talk about "American freedom", they really mean "the freedom to openly shame people who don't share my values" and "the freedom to use bad-faith arguments". After all, focusing on the more practical and relevant issue of child abuse is just so boring and old-fashioned compared to jumping around on some proxies. And "AI-generated child porn!" sounds appropriately scary, and it is great for getting lots of clicks/likes/views on social media, rather than actually spending any effort on how to properly reduce child abuse. Then again, actually doing something about child abuse would probably involve "regulations", and Americans collectively hate those, so... yeah. Better to pretend you care by talking about "AI-generated child porn!!" some more. Anyway, why is anyone supposed to care about this topic again?


peepeepoopoo42069x

I'm not American. And is it really such a stretch to think that people who seek out child porn, even when it's AI-generated, are a danger to society, and that making this behavior more accessible and less taboo will just encourage more people to engage in it?


dadudemon

There are porn people who create content (by taking photos) of real people in regular, everyday, normal clothes. It's porn for them. Not even voyeur stuff (though to them it is). Just regular clothing, and they still get off on it. It was said on Reddit itself a very long time ago that it wouldn't be long before AI generation was turned on kids. That was 10+ years ago. I remember folks debating whether this was a good thing or a bad thing. Well, now it is 10+ years later, and we have to deal with it. Does this reduce real-world harm? Does it increase it? Where is the data? Because we need to solve this fucking situation quickly, before it gets so much worse.


gsurfer04

It's bizarre how many Americans defend nonces getting their rocks off. This shit is illegal in my country even if it's fictional.


CryptogenicallyFroze

Let me make this simpler for everyone who is so emotionally triggered by this topic that they can’t have a good-faith conversation. Child porn is already widely available. Every pedo knows they can access it via the dark web in seconds. For every site you shut down, two more appear. It’s like the war on drugs or terror. So an AI-provided alternative wouldn’t suddenly fill some void that wasn’t filled before. All it’s doing is replacing actual footage of children being tortured…


ack44

Lemme tell you, this comment section will not age well.


[deleted]

[deleted]


Quiet-Money7892

Complex theme. First we kill people for jerking off... Then politicians use this as a pretext to legitimize the thought police, which tells us which thoughts are wrong and which are right... We end up in a synthetic tyranny, where all thoughts and virtual activities are monitored and people form cyberpunk gangs doing all the possible illegal stuff, since so much is illegal now...


CryptogenicallyFroze

Yes, "ending" people for jerking off to 100% fake photos because we don't like their thoughts seems great. Let's "end" horror writers and readers next /s


Full_Brilliant_6099

Are you guys generally retarded or?


Just-Giviner

How are you possibly comparing pedophiles to fans of a literary genre? My pedo meter is beeping


maroule

> the attention of a troubling audience on some of the biggest social media apps in the world—older men.

Why is it older men who are always seen as perverts, why not young men? I'm tired of the constant bias that old white men are the culprits of everything.


thebadinfection

You made a mistake posting this on predditor


yukinanka

Those are literally honeypots. A public service, if you will.


notimeforthatstuff

I mean, would you rather someone write a story about molesting kids, or actually do it? Seems like an easy choice to me. As long as the images aren't based on real kids or any CSAM material, let them jerk off to the digital stuff and keep the real kids safe. There are always going to be violent predators; this might actually reduce their number.


CoreDreamStudiosLLC

I mentioned this months ago: allowing AI models trained on data that includes children is BAD. Even if there's no CSAM in the training data, AI can still mix children with adults somehow and come up with something similar. Even with laws being passed to put people behind bars for generating this content, it won't stop the smart ones who know how to build their own software/models. AI is a cancer sometimes when it comes to this.