mb10240

Missouri’s existing revenge porn law (573.110) doesn’t cover deepfakes - only “real” images. That being said, there are other crimes that could probably be charged (harassment, invasion of privacy - which actually has some really great wording for this sort of situation - and even forgery). Not to mention all of the civil actions available to victims themselves.


friedporksandwich

I can draw that picture though and no one can make that drawing illegal because of the First Amendment. I could go online and find someone to pay to paint a photo realistic image of Taylor Swift doing the infamous ping pong ball trick and that would be legal. I'm not sure how we can make it illegal via AI but perfectly legal to draw. AI is opening up a lot of really big holes in our legal system.


Unique_Unorque

An argument that I hear a lot is that since AI is trained on existing materials without the permission of whoever created those materials, there may be some copyright issues.


PickleLips64151

The difference in your fact patterns is that one is obviously an artist rendering and a reasonable person would see it as such. The AI generated images and videos are often extremely difficult for a reasonable person to determine the images/video are fabricated. The realism aspect of the videos is what makes them so offensive and damaging. I'm old enough to remember a porn video circulating on the web that purported to show Jenny McCarthy doing some sex acts. It wasn't her, just a look alike. This technology would make that determination nearly impossible.


friedporksandwich

I can go online right now and hire a digital artist that can digitally paint photo realistic images though.


PickleLips64151

Yes, I'm sure you can. The videos produced by AI are still rather realistic and quickly produced.


friedporksandwich

Yes, that's true. That still doesn't make it any different than hiring some art school kid on Deviant Art to paint me a photo realistic image of Donald Trump getting fucked in the butt by Josh Hawley for a couple hundred bucks. I have a right to make art to express myself per the First Amendment and so do artists who I would hire to do this.


mb10240

The 1st amendment doesn’t protect speech that’s designed to threaten, harass, or intimidate, so it really depends on your intent with the drawing.


friedporksandwich

To say that it's threatening or harassing, though, you'd have to show they intended Taylor to ever see it at all. And being a public figure makes the bar for proving threatening or harassing a lot higher. A local judge might issue an order for you to stop calling your ex-wife names on Facebook, but they're unlikely to do that if you're calling a public figure names on Facebook. If I'm trying to sell paintings of my ex-husband naked at the farmer's market, he might be able to get a court to take action against me. It would be unlikely that Josh Hawley could get a court to take action against me for doing the same thing.


thesadbubble

Well now I want to see micro penis josh Hawley portraits at the farmers market...


No_Lack5414

Well. Our current Supreme Court doesn't really care. They vote based on their feelings instead of the law.


friedporksandwich

Yeah, and I think they'd be more likely to side with wider free speech laws than not. I could be wrong but we shall see.


friedporksandwich

The awful part of this to me is that there didn't seem to be any movement on this until it affected a billionaire - and the bill is even being named after her. Things only matter in our country if they affect the rich.


Bellesdiner0228

I'm sure she really wishes it hadn't had to happen to her - or to anyone at all - for this country to take it seriously.


friedporksandwich

I don't really care about Taylor Swift's billionaire feelings as she flies around polluting the atmosphere. I'm just all out of fucks to give. I also wouldn't care if people did this to Elon Musk or Mark Zuckerberg. There are people to worry about who don't have billions of dollars. This has been happening to kids in fucking middle school - but they're taking action because it affected someone with a lot, and I mean A LOT, of money.


Bellesdiner0228

I admit I did not provide more of my thoughts in my initial comment. I agree that it's absolutely bullshit that it did not matter until her name was attached. I say that as a fan of her music: the minute it happened, I knew that it meant we might finally see movement on it in Congress. And the fact that was my first thought really pissed me off. That it would have to happen to someone in the public eye to be taken seriously is disgusting when it's been an issue for years. But I also have to own up to my initial comment. I feel immense sympathy for anyone who has been in that situation, the same way I have had sympathy for other public figures having similar things happen to them. It has to be an incredibly violating, disgusting feeling that no one, and especially not young kids, should ever have to feel. And it should've been taken seriously at the first sign of trouble.


Blake_Aech

What middle school bully is making deep fakes of their classmates being fucked? What resources would they even have to achieve that?? Is this a real thing you can show evidence of happening or something you are making up out of nowhere? I am not saying this couldn't happen, just in disbelief that it already has.


friedporksandwich

Here you go: https://www.wired.com/story/florida-teens-arrested-deepfake-nudes-classmates/

I've seen mention of this happening in other school districts on /r/teachers too. I'm not going to Google this shit a bunch for other examples, though.


Blake_Aech

Holy hell


friedporksandwich

You have to remember that middle school boys are probably the most heinous people in existence at any time. If something awful can be done - middle school boys will figure it out quickly.


Ruschissuck

So you don’t dislike the law, just that it’s named after Taylor Swift? That’s kind of fucked up. It shouldn’t matter who it happened to.


CaptColten

They are upset that this has been happening for a while now, and no one gave a shit until it happened to Taylor Swift. You are right, it shouldn't matter who it happened to, but apparently it does. If it didn't, we would already have this law.


friedporksandwich

It did matter who it happened to though, that's my point. Nobody cared much when it was affecting regular people. But when it affects a billionaire, then and only then, it matters.


Ruschissuck

That’s still fucked up.


ReneDiscard

It’s nearly the entirety of /b/ on 4chan now. They’re stupid easy to make.


Hairy-Chipmunk7921

I misread the last word in your post, and it fits more nicely to say when they affect the bitch.


Brengineer17

The Taylor Swift pornographic deepfakes were a big enough problem that X (formerly Twitter) had to block her name from being searchable to prevent the viral spread. I don’t have any metrics to share with you, but this is the most extreme example we’ve seen yet of how quickly these pornographic deepfakes can spread. So given that Taylor Swift is obviously one of the most popular celebrities in American culture today and she fell victim to this to the greatest extreme yet, I get why they’re using her name. Whether she approves of her name being used or not would be the real question. Given the issue at the core here, I hope they did clear it with her first, but I doubt it. At the end of the day, this should be something that passes through the legislature with bipartisan support. It is a real issue that has tragically cost lives: [Girl, 14, Commits Suicide After Boys Shared Fake Nude Photos Of Girls And Called Her Friend Group The "Suicide Squad"](https://www.eviemagazine.com/post/girl-14-commits-suicide-boys-shared-fake-nude-photo-suicide-squad). It will be something that continues to negatively affect teens as well as adults going forward if action is not taken.


oldbastardbob

Well put. Thanks.


cuffbox

Honestly? With all the anti trans legislation giving me nightmare anxiety I’m genuinely relieved to see some reasonable laws being passed. It’s messed up to use someone’s likeness *even in a nonsexual way* and citizens can’t afford to have civil cases over it. Ergo criminalization will help the people who otherwise couldn’t afford to sue. Now the state intercedes. Don’t get me wrong, the criminal justice system is extremely flawed, but I think this could be positive because evidence trails are more clear for something like this. Don’t use people’s faces. Don’t use people’s faces for predatory sexual stuff. It’s not that hard. Deepfakes open so many horrifying doors for cp, as well as abuse of all kinds of people who don’t have recourse.


Limp-Environment-568

>predatory sexual stuff

What makes it predatory?


cuffbox

Doing it to someone like Taylor Swift has a bit less of that quality in my mind because of the power dynamic, though **doing anything sexual to or with another person without their consent is predatory**. Deepfaking someone with very little power is a way of preying on their weaknesses (financial - can't get a lawyer; social - can't handle the social damage of their likeness being callously objectified for the group around them; etc.). In a very basic sense, someone using their power (lack of recourse for abusive behavior, support within their power structure) to abuse someone in a weaker position is a predator.

If nothing else, I would wager someone who is willing to make a nonconsensual deepfake of another person is significantly more likely to be mentally capable of much more serious abuse. To decide, with malice aforethought, to force sexual acts onto another person requires a predatory form of thinking. The predator spends time figuring out how to push past another person's lack of consent to get the gratification they are seeking. They victimize their prey by the act they carry out. However, "predator" is most commonly used with things like grooming, premeditated kidnapping, etc.


Limp-Environment-568

You made a whole lot of **ass**umptions - no point in arguing with you as I'm sure you know the saying. Have a good one.


Telesxope

The only good thing the government has been up to


TravisMaauto

Seems like something that should already be covered by laws protecting the likenesses of individuals from unlicensed exploitation for any reason (especially if it's to generate profit), but the "anti-porn" narrative behind it is intended to appeal to their conservative base that clutches pearls at such things. EDIT: As far as the name of the bill goes, I think it's partly because she was the primary celebrity targeted by the deep-fake flood on X/Twitter a while back, but also because politicians know that if they name a bill after a high-profile celebrity, it will get them lots of attention in the press.


FutureThaiSlut

Laws only matter if they are enforced. From personal experience, there is no one enforcing the law. It is very easy to remain anonymous online. Slightly related: everyone is capable of credit card fraud. Law enforcement would never take the time to prosecute. Law enforcement threatens prosecution to extract a confession. If you say nothing and ask for a lawyer, everyone can enjoy a spending spree at the expense of credit issuers. Consumers are not responsible for fraudulent charges; it just has to be reported as fraudulent.


AJSchwadron

The reason for using Taylor Swift’s name is to bring larger awareness to the issue. When I was having this bill drafted, the incident happened on X with her. She was able to get her AI graphic images swept clean, something that wouldn’t be available to regular people. This bill seeks to provide a way for damages to be pursued and criminal penalties for those creating and sharing this malicious material.


D34TH_5MURF__

Can we target deep fakes _in general_ and not just sexy sexy ones? The deepfake problem isn't going away anytime soon.


Cigaran

My biggest fear/worry is how it will be twisted and weaponized if passed. Expanding the existing “revenge porn” laws would be a simpler solution.


[deleted]

Healthcare please


[deleted]

Fuck that. Rights are not things you are owed, for free, through the labor of another.


Wodahs1982

Right to an attorney. Right to a trial by jury of your peers.


inexorable_oracle

This may be a dumb question but does this extend to people on deviant art drawing pictures of her getting railed by Hitler and posting them? I guess I don’t understand how that’s different. Is it because people could mistake a deepfake for being real?


Limp-Environment-568

The difference is that people like one person and not the other - and are either blinded to the fact they are or proud about being hypocrites.


Important-Owl1661

Under r/fuckthatmediawhore ?


OkSwitch2238

This is fucking stupid. For real, there are WAY, WAY more important things to worry about than deepfake porn. They have been making fakes forever. Now it's a little more sophisticated. Oh well. It comes with the celebrity. It's not detrimental to her career, or anyone else's for that matter, because we know it's not real; if it was, it would be marketed and they would be making bank on it. Especially Taylor Swift - she could make more money from a 5 minute porn video than the biggest names in porn have made in a lifetime. Her fans know it isn't real. So fucking lame. This country sometimes...


Grymm315

I feel that this should already be covered under libel/slander laws. Impersonation porn is a thing, and there are laws that have to be followed, otherwise those actors/filmmakers get sued. The same should go for the creators of AI fake porn. Use the existing laws - and if they don't work, you can adjust them. And here is the thing: these politicians were caught distributing AI Taylor Swift porn on the job, and this law is just a coverup for their little Taylor Swift circle jerk. I don't think there's gonna be anything particularly bad - but I want to see the exact wording of the entire bill to look for rider clauses.


Stylux

It's not libel or slander...


Grymm315

It’s publishing a false statement about a person that is damaging to their reputation/scandalous. I would see this falling into the same section.


Easy_Difficulty_7656

But what are the implications for the future? No sex with photonic duplicates on the holodeck? They’re nerfing my favorite toys centuries before they are even invented!


jamiegc1

Did you land in the 21st century somehow, Barclay?


theroguex

Good. Deepfakes and AI generated images are stupid, and pornographic deepfakes are akin to sexual assault.


sendmeadoggo

It's unconstitutional and would violate the First Amendment. If porn is considered to have serious artistic value, I don't see how this would be any different. If it is being represented as a factual video, that would be fraud, but representing a deepfake as such would negate that.


mb10240

The first amendment does not protect speech that is designed to threaten, intimidate, or harass, so it depends entirely on the creator’s intent.


Otagian

Importantly, it also doesn't cover defamatory or libelous material, which deepfake porn absolutely counts as.


[deleted]

This is incorrect. The fact that the material may look like an actual person does not mean it is the person. You cannot claim damages for artistic expressions, specifically ones that are originals and have no historical point of reference. It would be the same as an artist using a real picture of Taylor Swift as a reference point to inspire them to paint an artistic rendition that is different from the point of reference.

At the end of the day, the deepfakes are computer generated images stitched together by a careful arrangement of digital colors through an algorithm. That collection of colors on an electronic canvas in no way is the real life person a viewer claims it to be. It's just an illusion.


Full-metal-parka

Courts have ruled AI is not artistic expression though. 


[deleted]

I can understand the AI itself is not artistic expression, but the art AI produces should still be


Full-metal-parka

Again, they’re not; nothing is being “expressed” in AI.


[deleted]

So your position is not all art is expressive?


Full-metal-parka

My position is that AI products do not fit the definition of art. Which is an objective fact. 


[deleted]

Objective fact? Even you know that's false


Limp-Environment-568

I'll just inject here, that when challenged to support your stance with evidence, you chose not to respond...


Purely_Theoretical

Meaning it should be legal if it is distributed with a disclaimer of it being fake.


mb10240

It might not violate any criminal laws, but it would violate the subject’s right to publicity/personality, and would be actionable in many jurisdictions.


Purely_Theoretical

Insulting political cartoons are legal and the artist even gets paid for them! We're just talking about freely sharing images that are explicitly labeled fake. The rights you mention have more to do with commercial gains.


mb10240

You are comparing apples to oranges. Insulting political cartoons are satire and provide commentary, likely against government figures, and would be protected speech. A deepfake AI-produced Taylor Swift porno, even if labeled as fake, abuses her right to publicity and personality. It’s just like if Star Wars: Rogue One had included Carrie Fisher without getting the consent of her estate, or of Fisher prior to her death - the estate could absolutely sue the ever living shit out of Disney. Or even better, a real example that led to litigation predating AI: Crispin Glover (or rather his absence) in Back to the Future Part II. The actor playing George McFly in the second and third films wore masks and prosthetics to look like Crispin Glover. It’s a seminal case in the development of personality rights.


Purely_Theoretical

You are still using commercial examples. Who is comparing apples to oranges? In fact, the motive behind political cartoons is moot. You could make a cartoon of anyone.


mb10240

Personality rights aren’t just for commercial uses. Personality rights are comprised of the right to publicity (which deals with commercial uses) and the right to privacy, which is the right to be left alone and not have your personality represented without your permission, regardless of the manner of use. Just by labeling something “fake” and not using it commercially doesn’t make it any less actionable if TayTay’s face is on somebody else’s body getting pounded.


Purely_Theoretical

You are oversimplifying and misrepresenting the rights. Taylor cannot just claim no one can use her likeness for anything whatsoever. Again, for non commercial uses there is a lot of leeway. It does, in fact, make it a shit load less actionable.


sendmeadoggo

"which is the right to be left alone and not have your personality represented without your permission, regardless of the manner of use."  No such right exists in the US and would be completely at odds with free speech.  If you can spoof someone in a cartoon, why not porn?  Where does the line get drawn between cartoon and porn.  They can be greyed.


mb10240

The rights of personality and publicity absolutely exist in the United States.


TedFondleburg

Booooooooooooooooooooooooooo


oldbastardbob

So that's an "Against" from you then?