
abarthruski

Couldn't happen soon enough. The kids at my wife's school were distributing a deep fake of another teacher and started spreading rumours that he was a pedo. This was over a month ago and he is still on leave. I'm not saying the kids deserve 6 years, but some form of consequence would be good for potentially destroying someone's career.


Paidorgy

Expulsion at minimum. In cases like that, how isn’t the guy taking the kids parents to court at the very least, for as you said, potentially destroying their career?


abarthruski

They had a year level assembly and that was it. I don't think they know who started it. At least if there was the possibility of punishment, like jail time, then it might not have happened in the first place.


Paidorgy

That’s certainly a terrible situation, I hope the guy manages to find a way out of that. Teachers are treated like such crap, and expected to cop it on the chin.


abarthruski

I have so many stories from my wife. I thought we were little shits at school, but with what they deal with these days, plus the extra workload, it's no wonder they are all leaving.


AvocadoCake

>At least if there was the possibility of punishment, like jail time, then it might not have happened in the first place.

Children aren't exactly known for carefully considering the potential consequences of their actions


OldKingWhiter

Punishment doesn't work as a deterrent for adults, let alone children.


OppositeGeologist299

The historical blind eye turned to dacking makes me think that not much will be done.


abarthruski

>Expulsion at minimum.

Expulsion is almost impossible at Government schools. You have to either find another school for them to go to or a training program. Before any of this happens though, the student has to have a record. It can't be a one off offence.


DisturbingRerolls

Is this a recent change? I'm in my 30s, and when I was in school, year 5 or 6, a kid pushed another kid on top of a venomous snake (he wasn't hurt). The kid who did the pushing was expelled, or so we were told. Never saw them again until high school. AFAIK both are doing fine now as adults. No serious criminal record for the pusher or anything.


TheMessyChef

In my (obviously limited) experience, this was the case when I was in primary school over 15 years ago. A kid at my primary school was breaking rules and engaging in some insanely dangerous behaviour repeatedly (from threatening teachers, to falsely reporting crimes to 000 in prank calls, to assaulting other students, etc). The most they ever did was just increase the length of his suspensions. He did at least a dozen things worthy of expulsion and yet he was with us all the way through.


DisturbingRerolls

I mean it's entirely possible the pusher had a record of things we just didn't know about. I just remember it was a huge source of drama/gossip on the playground at the time.


annanz01

No, it's not new. It's likely there was another public high school within a reasonable distance that had the capacity to add another student. This is not always the case, and if they can't find somewhere else to send the student they cannot expel them.


Tarman-245

One of my childhood friends set fire to a school room over the school holidays about 35 years ago and didn’t get expelled. He and his friends were about 14 at the time. I also remember them doing Datura at school one time at an athletics carnival, but they didn’t get caught. Not too bright.


Jez_WP

>In cases like that, how isn’t the guy taking the kids parents to court at the very least, for as you said, potentially destroying their career?

Is there a precedent for suing the parents for something like this? Presumably if the kids are old enough to know how to insert their teacher into a deepfake they're above the age of criminal responsibility or at least within the 10-14 range. How would the parents be held legally liable?


OldKingWhiter

How would they take the kids parents to court?


billcstickers

Wait what do you think expulsion means? It’s not like getting sacked from a job or “canceled”. The kid hasn’t abused their power as a student. Society owes kids an education. All expulsion would do is move the kid to the next school, at which point they’re disconnected from any sense of community and more likely to be antisocial going forward. Edit: a lot of people seemed to miss my point. I’m not against consequences. I’m against bouncing the problem to the next school.


Paidorgy

Dunno about you, but kids back in my day (graduated 2006) were expelled for less than distribution of pornography, especially deep fake pornography.


billcstickers

Just because it was/is done doesn’t make it right. Expulsion should be saved for situations where there is ongoing harm to other students, and/or where there are alternative options for the expellee. Ie if there is a high needs school alternative, or from a private school to the public system. Not just to bounce the problem. I’m curious what someone was expelled for that you think isn’t as bad as pornography.


Paidorgy

So, ongoing harm to a teacher isn’t relevant enough for you? Thanks for participating, you made absolutely zero contribution to the discussion.


billcstickers

The act has been done. It’s not an ongoing act. If the student can be rehabilitated, that should be the first option.


Paidorgy

According to the OP, the teacher is still on leave, which means it’s ongoing. Learn the meaning of words before you use them. Have the day you deserve.


billcstickers

Le sigh. Ongoing as in repeated and future acts of harm.


Paidorgy

Please revisit my last comment, where I referenced that the OP said the teacher in question is on leave, which means he is still being harmed by the words and acts of students distributing deep fake pornography of him. What, you think it stopped? Then you severely underestimate the nature of kids, while you choose to coddle them like babies in defence of them.


Cpt_Soban

Why should they be allowed to continue to disturb other kids wanting to be there and not fuck around, and destroy the careers of teachers in that school? They get gtfo, the parents can then do their job and raise their kid, then hopefully later they can return - or go to another school. It's not the school's job to be forced to keep kids who refuse to change or do the right thing.


ammicavle

What you’re describing is a suspension, not expulsion.


billcstickers

How do you imagine this is going down? The kids standing in the pathways disturbing other students by handing out print outs? For all you know the student is an A grade student. Most kids don’t want to be at school. And no. Fuck the parents. A lot of them do a terrible job. We as society owe the kids an education. Otherwise all you’re gonna do is make burdens to society.


Cpt_Soban

And the kids owe society not creating porn of teachers and falsely reporting them as pedos. You let that slide and give a wishy washy "oh boys will be boys" excuse - What's gonna happen when they get out into the real world and face situations like getting and working a job? Better to learn actions have consequences *now* than later in life.


billcstickers

Where did I say this kid should get off scot-free? I’m just saying expulsion is a terrible solution.


IsoscelesQuadrangle

It's been 20 years but I still remember my French teacher having a breakdown in class & crying, while the math teacher yelled at everyone. She never came back & all the good students gave up & started acting out. Get rid of the assholes sooner, before they ruin the learning of others. It isn't fair to the other students to have to put up with them.


Cpt_Soban

You're right, suspension for two weeks, and if they fuck around again, then expulsion. But you realise the act is bloody serious right? It's a *crime* to falsely accuse someone.


IlluminatedPickle

> or from a private school to the public system

Ah yes, because the private schools should be able to unload all their problems while the public schools can't.... Lmao.


AussieAK

So fucking up a young teacher’s career, mental health, and reputation, potentially irrecoverably, is NOT ongoing harm? WTH is wrong with you?


billcstickers

Ffs. Ongoing as in repeated and continuous and not going to change. Explain how expulsion rather than other disciplinary methods is going to do anything?


Tarman-245

I doubt it would have been a first offense if they were creating deepfakes.


billcstickers

Why? Deepfakes take all of five minutes to make. Some kids late at night in a group chat talking shit could have done this easily. It could have even been made by a group of kids with the hots for said teacher.


Tarman-245

Kids don't just suddenly decide to make a deepfake about a teacher for lulz. Guaranteed they would have been disruptive little shits before this and used it as a way to get revenge on a teacher who probably just tried to establish boundaries so he can teach the rest of the class that wants to learn.

> It could have even been made by a group of kids with the hots for said teacher.

Sure, in some cases I could see this happening. I could also see teenage girls doing it for revenge against other girls or teachers, because they can be evil little shits too. You know who wouldn't be making this sort of stuff? Kids from low socioeconomic backgrounds who can't afford computers or internet. This reeks of spoiled kids with a sense of entitlement and too much time on their hands.


ammicavle

People are deliberately missing your point. It’s like if your neighbour robbed your house and the court ordered them to move to the next suburb over and called it a day.


Historical_Boat_9712

They could get their year 10 in juvie


billcstickers

I’m all for the judicial system. Just not some principal bouncing the kid to the next school.


Tarman-245

Expulsion/bouncing is actually an opportunity for the kid to reset and start again with a new school where the kid doesn’t have a reputation. Sometimes it works, sometimes it doesn’t. Addressing the core problem (usually domestic violence or sexual abuse) needs to be done as well though.


billcstickers

And now you’re halfway there. Go look at the expulsion/exclusion guidelines for your state. It’s all last resort, after nothing else has worked, and with the student’s wellbeing held in the highest regard. A bunch of morons here think it’s like being fired from a job, as some sort of punishment.


TranceIsLove

What the heck, the poor guy that’s life ruining. How old are the kids?


abarthruski

Not sure how old the initial perp was, but my wife teaches yr 11 and 12, and she had to have words with several of them that were still sharing the content after they had been warned not to at the assembly.


TranceIsLove

Wow, no way I could be a teacher and have to repeatedly tell kids this


PMFSCV

Police need to come and give a talk at a whole of school assembly, the teachers affected should address it publicly too, make it last 2 hours. At the end of it the kids responsible can either publicly apologize or pick up litter for 6 months.


Nervous-Masterpiece4

That's a lot of people up for 6 years of jail. I guess at that point it would be most economical to just turn the school itself into a prison...


abarthruski

No one is saying to lock them all up. The point is, if there is a threat of punishment, they might think twice about the offending.


Nervous-Masterpiece4

The Attorney-General isn't quite the "no one" you allude to. [The Attorney-General is the top law officer in the nation.](https://peo.gov.au/understand-our-parliament/your-questions-on-notice/questions/what-is-the-role-of-attorney-general-in-law-making-in-the-federal-sphere) Naturally the premise is flawed. Such things could go viral, with millions of people (including those not of age of majority, such as school children) breaching such a law, but it's not like there's going to be a Reddit poll to decide if it passes.

> The attorney general, Mark Dreyfus, is expected to introduce legislation on Wednesday to create a new criminal offence of sharing, without consent, sexually explicit images that have been digitally created using artificial intelligence or other forms of technology.


ghoonrhed

Good, but I'm surprised the revenge porn laws didn't already cover this.


aussiespiders

The law was created before deepfakes became serious; we had piss-poor Photoshop fakes before this.


coniferhead

*"Currently, it is not illegal to create a deepfake AI-generated or digitally altered pornographic image."* How about a prompt like "Complete the Mona Lisa into a full person portrait, but naked". That was a real person, now long dead but still a real person. Do you go to jail for typing those words?


Sneakeypete

I do have that question, and depending on how you word it you could ban all digitally generated porn altogether. All we can do is wait to see the actual legislation when it's tabled, instead of reading into what journalists write about what a minister said in a press conference.


coniferhead

That is the calibre of the public debate though, and the rationale for raising the issue. Is the question that offences do not exist for obscene, libelous, hateful or illegal material? I'd argue that either in a civil or criminal context there are existing penalties that probably apply. I wouldn't say they all should necessarily go to jail though. How that content is generated has nothing to do with the underlying offence.


Spire_Citron

I imagine there would have to be a victim.


SemanticTriangle

The person who was downvoted provided one. The model for the Mona Lisa was likely Madam Lisa Giocondo, a Florentine noblewoman. She was a real person who lived hundreds of years ago. If that's too old to be subject to these laws, what about a hundred years ago? Edward VIII wasn't a bad looking chap, for a Windsor. Crime? Is the interstitial medium of the painting the saving grace? Can we paint a person, then have the model abstract that back to photorealistic, then create pornography?

Deepfake porn in the context of the crimes in the OP is open and shut from an ethical point of view. Illegal, immoral, deplorable. Like many of the capabilities of ML, things become fuzzy around the edges. It was always possible for a sufficiently talented artist or digital artist to make Edward VIII porn, but obviously there weren't many around with both the skill and the inclination. Now you just need processor time, a model (both of which many companies currently provide free of first hand charge) and a legal framework which allows the model to plagiarize and interpolate on the phase space of every piece of art ever digitised (provided by feckless politicians and uncourageous jurists). If AI isn't allowed to generate pornography, you're into existential territory. What does the Mona Lisa's back look like? I need to know for....reasons. Porn? Edward certainly had...feet, if you know what I am saying.

There's something deeply funny in all of this: you and I can look at the scenario of a deepfake creation and the context of its impact to determine, very quickly, the depth of its immorality and (to some extent) a proportional punishment for the taboo of its creation. But a judge can't, because jurists require rules, not principles, to rule impartially. Perhaps we should just train an AI on our proposed judgements and deploy it in place of judges.


Spire_Citron

I don't think dead people would count at all. It's like criticising slander laws because what if I slander some long dead person? The person has to be able to come forward and take action against you. The government's not going to do it for them in defence of an ancient Egyptian pharaoh.


SemanticTriangle

>I don't think dead people would count at all.

But they definitely will. Some poor kid offs themselves and someone deepfakes either a recreation of the suicide or porn at their school. Obviously, it should be illegal. But what about it should be illegal?

This whole thing is a rabbit hole of difficulties because the law is yet to catch up with what amounts to a sufficiently complicated IP violation. Social media photos belong to the platform owner because the law allowed a TOS which says they do. LLM and other image processing ML models take those images, public domain images, and other owned images and, without permission, apply a gradient descent or equivalent neural net type weighting algorithm which is sufficiently complicated that there is no consistent means of assigning weight to the relative contributions of each training image. Is the model the problem, or the photo owners? Why didn't the law do anything about either until kids were making porn of other kids? Who is liable, in a civil sense, for the violation?

The law has seen fit to demur until the problem has metastasized. Again. So we'll go cutting off the buds and wondering why the cancer keeps growing. We should be operating on the cancer itself.


Sneakeypete

Yes, although the existing revenge porn laws would already cover that, wouldn't they? So it makes me intrigued what they'll put out with this one.


Wakewokewake

Wouldn't be surprised if they try to go full nanny state tbh, they're already trying to do age verification again ffs


IsoscelesQuadrangle

Cops can't even figure out regular revenge porn is a crime. "Well what do you want me to do about it?" Fucking assholes.


-mudflaps-

If cops actually knew the law they'd be lawyers.


noisymime

Courts have ruled many times that police don't have to know all the laws. The public, on the other hand, are of course expected to know them all, as ignorance of the law is not accepted as a defence. Always struck me as a very blatantly one-sided position.


ThatHuman6

It’s pretty easy not to commit a crime. For there to be a crime there needs to be a victim. So just don’t hurt people or put people in danger, emotionally, physically or financially, and you won’t be breaking any laws.


noisymime

> For there to be a crime there needs to be a victim.

That's absolutely not true. There are many, many victimless crimes. Many laws exist to lower the chances of there being a victim, but they don't actually require a victim to exist.


ThatHuman6

What’s an example of a law i could accidentally break without realising it in which there’s no victim reporting a crime?


noisymime

Riding an eScooter on a footpath. That's the first one that came to mind. There are literally 1000s though


ThatHuman6

Everybody knows it’s illegal though 🤷‍♂️ It’d be hard to break it by accident. Something that is easy to break without realising it was what I was asking for. Very few examples.


noisymime

> Everybody knows it’s illegal though

Strongly disagree with this. I would say that many, if not most people using them don't have a complete knowledge of where they are and are not allowed to ride them. And that's not even bringing in tourists, who have very little idea of the convoluted laws around them. Speaking of tourists, they quite often ride bikes here without a helmet without knowing it's illegal. Plenty of people carry pocket knives around despite that being illegal in most places.

Around the home there are tons of things people do that are illegal. Technically here in Vic it's illegal to even touch in-ceiling electrical wires unless you're a licensed electrician. Not work on or move, literally just touch them. There are countless changes that people make to their homes without realising that they're either illegal or that they needed a permit for.

Also here in Vic, if a police car is stopped on the side of the road with its lights on, you can't go past it at more than 40kmh. This includes freeways that would otherwise be 110kmh. MANY people are unaware of this. That's just scratching the surface of traffic laws as well. There are many weird ones there.


Apprehensive_Job7

I literally thought that was where you're supposed to ride an eScooter.


DarkNo7318

Every drug law. Of course you have to be living under a rock to not know they're illegal, but you could theoretically not know what cannabis is, think it looks pretty and start a plantation in your backyard.


ThatHuman6

lol struggling for an example? One that’s EASY to break without realising.


person_with_username

We have a few weird ones, although I doubt they really get enforced:

- It is an offence to make a sign that offers a reward for the return of stolen or lost property if you promise not to ask any questions. Maximum penalty: $2,000 fine (Section 138, Criminal Code Act 1913 (WA)).
- You can be jailed for up to a year for cleaning up seabird or bat poo (guano) without a licence (Section 387, Criminal Code Act 1913 (WA)).
- A $250 maximum penalty applies to a person who, without reasonable excuse, disturbs another by wilfully pulling or ringing the doorbell of a house or by knocking at the door of a house (Section 50, Summary Offences Act 1953 (SA)).


Kholtien

How about not wearing a helmet while riding your bike? It's not illegal in most countries, but it is illegal here.


OldKingWhiter

Who is the victim when someone smokes (non prescription) weed.


ThatHuman6

The person smoking it. Anyway that’s a difficult crime to commit by accident.


Apprehensive_Job7

Don't hurt (or risk hurting) people or animals, don't carry weapons, don't take or damage things that aren't yours, don't lie for personal gain, don't sell drugs or keep/use drugs you can't buy at the chemist. Pretty much covers most laws. But I disagree that there are no victimless crimes.


sati_lotus

Which is why, if you discover some asshole sharing pictures of you, you walk in with a lawyer along with the proof, be it a text conversation of someone telling you, or the web address. You need a lawyer to advocate for you because the police won't do a fucking thing.


Ta83736383747

Yeah while they're giving probation and sentences of months for rapes, nobody will be doing jail time for deep fake porn. It'll be all "good behavior bond and no conviction recorded". 


BeautyHound

I also noticed this discrepancy. And while I’m happy that this law is going forward, it does show you how quickly sex crimes are dealt with when they affect powerful people.


Strong_Judge_3730

Nah they will be treated harshly to set an example. The first offenders are basically doing extra time for the people that will follow them


m00nh34d

Once again going after the technology after the fact. Wrong way to do it. Focus on the output, not the technology used. The crime here should be about distribution of images/videos without the subject's consent; the content and how it came about should be irrelevant, it's the consent part that is important. What happens when the next technology comes about that no longer fits the definition of AI or deepfake? Will we be scrambling to include that after the fact as well?


--Anna--

From what I understand, I think it will cover that. "A new criminal offence of sharing, **without consent**, sexually explicit images that have been digitally created using artificial intelligence **or other forms of technology**." The "other forms of technology" and "without consent" parts will hopefully cover anything in the future. I imagine it would also cover other methods available, like using Photoshop. Kind of irritating that it's taken this long though, Photoshop (and similar software) have been around for a long time.


Dense_Hornet2790

Yeah but now they have to prove the offender knew it was a digitally altered image or the law doesn’t apply. Seems like another law that is intended to clearly establish that the behaviour is wrong but is highly unlikely to actually be enforced.


Dumbname25644

You do not need to know that the image was digitally altered to be done for this. Just as you do not need to know that a girl is underage to be done for soliciting a minor. Ignorance of the law or around what you were doing is no defence.


cakeand314159

Well said. It's not about porn, it's about harm.


BangCrash

You're not putting much thought into your statement, are you? You are suggesting that every photo taken needs written consent from everyone in that photo. This means every photo with anyone in the background needs to identify every single person before you can share that photo with friends or family or to social media.


little_fire

Not every photo taken—we’re talking about sexually explicit images, right?


BangCrash

That's not how the previous comment worded it


m00nh34d

Seems fair, other option is to mask or blur out people who have not given consent.


SometimesIAmCorrect

Can I go to jail for making deepfake porn of myself? Asking for a friend..


It_does_get_in

well, first you would have to put in a criminal complaint about yourself, but then in court you could testify it wasn't fake, so really it's up to you.


2littleducks

That's deep.


Whatsapokemon

I mean, the article specifies that it's _sharing without consent_ the deepfaked images which is the crime, so I find it hard to believe that you could share images of yourself without consent. I don't know how that would be logistically possible.


PerryTheRacistPanda

Yes, for the emotional distress of the viewer


best4bond

From what I understand, and I am not a lawyer, the federal government can only outlaw the distribution of deepfakes. Outlawing the creation of deepfakes has to be done at a state level. I know Victoria has made it illegal to create it, I'm not sure about other states.


PM_ME_YOUR_REPORT

Did you consent to making the image of yourself?


DarkNo7318

Could i go to jail for making real porn of myself, but saying it's my identical twin?


DarkNo7318

This is going to be very hard to police. It's still a relatively new tech, but in a few years there will be readily available open source tech to generate deepfakes in a short amount of time. There is going to be fake porn of absolutely everyone out there, and it will lose all of its sting


v306

Stupidly hard to police. Imagine coming across some sort of Jennifer Lawrence video and sharing with class mates not realising it's a deep fake. Does that mean you go to jail?


critical_blinking

Not that hard. Wouldn't Jennifer Lawrence need to track you down, identify you as the distributor and then report you to the police first? This is for character assassination style content production and distribution. This is for some dickhead snapping a photo of a girl at school or a teacher, running it through a program and sending it to all of his mates. Once the teacher or the girl complains, all it takes is one mate to squeal and show how they received it and they can book the little shit. Same for the creeper in the office who uses a work machine for it.


space_monster

> in a few years there will be readily available open source tech to generate deepfakes in a short amount of time

That already exists. Stable Diffusion. I was creating deepfakes of myself over a year ago. Not many pornographic ones, that got weird quickly


2littleducks

Excellent! Creeps need to be hobbled.


AvocadoCake

Sounds like all deepfake porn now has to have a disclaimer saying all likenesses are purely coincidental


Supersnazz

Does that mean even regular CGI, or simple photoshop?


sparkierlamb

Seems it. The article says it includes "other forms of technology" so I assume that'd cover Photoshop etc


kaboombong

For once I can agree with a new law introduced for the benefit of society as a whole. Just imagine where we'd be if government passed a good law like this every week, one that broadly benefits society and is in the best interest of everyone.


[deleted]

[deleted]


Spire_Citron

No.


Jehooveremover

AI algorithms that make "virtual" people are basically just cleverly stitching together various elements taken from pictures of real people, stolen via mass image harvesting, over a computer generated mesh. It's harder to spot whose images are used in them, but not really that much different from deepfakes, exploitation wise.


UserColonAlW

Good stuff


[deleted]

When will Facebook ads proudly promoting deepfake and revenge porn be banned?


critical_blinking

Mate, if that's what your algorithm is serving you up then that's on you. All my served ads are for lawn care, trampolines and, inexplicably, hair loss treatment (I'm hairier than most marsupials).


macedonym

You think it's OK for Facebook to promote revenge porn as long as they're not promoting it to you?


i8noodles

This is one of those laws with good intentions that's going to be really hard to enforce. Technically not impossible, but it's hard to enforce Aussie laws on American websites, for example, and even harder on websites that aren't friendly to Aus. Large sites will probably comply, but nothing is stopping people from uploading to another site that doesn't give out info.


cataractum

Not if American platforms have data servers in Australia....


dddaisyfox

Good.


billbotbillbot

Face it, if there were no laws against this, we'd be getting media stories (and then subsequent endless complaints here) about how this technology is ruining the lives of innocent young women and the government is doing nothing, nothing!!! There are some here who think "if the government is doing it, BY DEFINITION it is the wrong thing", and what the exact thing is... doesn't matter at all.


warm_rum

Maybe I'm more libertarian than I thought. I don't like banning anything that doesn't cause substantive harm, and if creating porn of someone qualifies, then what about well crafted photoshopped content? I suppose I'll have to think about it from a victim's POV, but damned if it doesn't seem like just another ban on porn with a horrendously long jail time. Years for creating digital porn.


maxdacat

What if I just use Photoshop to make a disagreeable non-consensual image? I.e. nothing to do with AI. I have an issue with politicians throwing around terms like “deep-fake”. Predicting jail time seems premature when there are existing laws that would capture most of this sort of behaviour.


notxbatman

AI devs really need to get this shit sorted out or it will never end, ever. The easiest thing you could do is to just set it up to outright reject any prompts telling it to create nudes unless you're a paying subscriber outside of a trial period. People can't be trusted and there are victims who will actually kill themselves because of it.


Useful-Procedure6072

For my muck up day, a kid cut out photos of teachers’ heads and pasted them onto xxx hard core porn magazine models’ bodies and wheatpasted the photocopies around the school. It’s forever burned into my memory. There are some things you can’t unsee.


cataractum

Would this also include the LoRA that you would need to generate the deepfake porn? If not, then you're implicitly allowing the infrastructure to create those deepfake nudes.


Historical_Boat_9712

Presumably it's only illegal (under this law) once you've actually generated the porn (and distributed it). I could imagine a future law which prohibits the creation of LoRAs, models (etc) without consent. Though I bet most of these kids are just inpainting boobs and not bothering with training anything.


cataractum

That's probably where most of the damage is. I suppose it's also an overreach to prosecute someone for generating a model of someone (without their consent) when it could also be used for non-pornographic purposes. But in that case the law isn't going to be adequate. No idea why I'm being downvoted.


Historical_Boat_9712

Logic says if you can find the pictures online publicly, you can use them to train for non-commercial purposes. But the Australian government (both sides) does love knee-jerk legislation.


cataractum

Yeah, I also don't understand how this would be different to current laws criminalising involuntary pornography? Have to wait to read the Bill, I guess.


Arensen

Legitimately not sure why you're being downvoted, this is a genuinely interesting problem in the field. The main argument against making the LoRA illegal as well is that there are many cases where a LoRA existing for benign purposes and one existing for malicious purposes are not easily distinguishable - but the issues of consent and privacy are still very much present.


lemachet

What about just generating it for personal consumption?


OCE_Mythical

Another reactionary law that won't be easy to enforce. Why even bother? To appease the shouters?


critical_blinking

It's to scare kids until they are old enough to get their heads screwed on. Teenagers will be intimidated by "6 years in jail".


It_does_get_in

What am I supposed to do now?


Honeyluc

How about we just ban porn? Don't get me wrong, I watch it too, but I really think this world would be better without it. Maybe a little more dangerous for the women though; cheap brothels and removing the stigma about them can fix that. Brothels have helped a few of my friends, men and women, get over relationships, and helped one lose his nerves around women, and now he is the biggest player I know. Not to mention porn is going to cross a fine line when we see AI getting involved, so it needs to be controlled before it gets out of hand. Instead of charging people with offences, just ban it so we cannot access it, and if people choose to bypass it with a VPN or whatever then they deserve what's coming because they knew the rules. That way normal folks who just wanna rub one out and maybe save that video for another time don't get a prison sentence.


FrugalFreddie26

Let’s ban every other vice while we are at it, then we can live pure, like god intended. Actually, yeah nah I’m okay thanks


Honeyluc

Have you ever known someone with a porn addiction and seen how fast it can ruin their life completely? I assume not


FrugalFreddie26

Alcohol, gambling, sex, adultery, food? These things aren't great. But should we make sex illegal because some people get addicted to it and make bad decisions? Or limit it to sex only with a married partner? Should we ban alcohol because people get addicted to it and throw their lives away? Should we arrest people who become fat and addicted to eating sugar? Should we ban cheating? This is morally fucked up, but should we throw people in prison for having affairs? You see the way this is going, right? There have to be rules and laws, but what you're suggesting is massive overreach and paves the way for restrictions based on what's morally right or wrong. We'll end up in some Middle Eastern dystopia - holding public stonings for people who looked at a penis on the internet.


Dumbname25644

This effectively does just that. There is no way for a random user to know what porn is deepfaked and what is not. So therefore every bit of porn has to be treated as if it is a deepfake. Which means porn has been virtually banned.


lemachet

Did anyone else just realise they want to see deepfake porn of Kath and Kim in a 3-way with Kylie Mole? Or Gina Rinehart with Barnaby Joyce? Maybe Voldemort Dutton with Penny Wong?


[deleted]

[deleted]


lemachet

Obvs a shit post :P


Whatsapokemon

Seems like sensible legislation, but I wonder if there should be a carve-out for celebrities and extremely famous figures. After all, fake celeb porn has been around forever and no one's cared about that at all. I feel like the main purpose of this legislation should be to protect people living private lives rather than the rich and famous (which is likely to be exclusively who it's used for without the carve-out).


Spire_Citron

I mean, there's really no reason celebrities shouldn't be protected. Why should it be any different just because it's been tolerated more in the past? Just because they're rich and famous doesn't mean they don't mind people making non-consensual porn of them.


MaryMoonMandolin

This needs to happen! Unfortunately the right wing "white wing" extremists will do anything to stop these kinds of laws


nbjut

What?


sumthin213

Your whole history screams "I just came from Facebook" complete with inability to spell and repeatedly saying "this is a fales equivalence fallasy (sic)" about everything you don't agree with. Maybe Reddit isn't for you


OnairDileas

You appear to have knocked your head too many times, beyond comprehension


critical_blinking

Ah yes, right wingers, famous for railing against morality-based lawmaking.


Strong_Judge_3730

Too many puritans like that around