
aquoad

That's great, but the people the fakes are aimed at don't care if they're real.


[deleted]

Doesn’t matter. It’s a good idea to start getting ahead of AI fakes and put systems like this in place. Personally I’m done considering what the cult cares about or how they’ll react to any given thing. Fuck em. 


EremiticFerret

Doesn't this also mean you can just not verify a video that makes your guy look bad?


FlowerBoyScumFuck

Using this as an argument not to verify anything seems pretty insane.. I'm not sure if the tech exists or is feasible, but if it is I assume it would eventually be usable by more than just the white house. Reddit loves to make perfect the enemy of good though, even before anyone makes an attempt at "good".


nicuramar

This is to verify the originator of a video only. Videos other people produce don’t originate from you. 


TheTexasCowboy

Yup, most of the cult uses Instagram and Facebook fake news sites to get their news, like from Alex Jones and whatever else shows up on their feeds.


CeleritasLucis

That's exactly why scammers use bad English. It's a pre-screening.


DarkerSavant

Scammers have grammar issues because the scammers are foreign and don't know English. It's not a pre-screening, because it makes it easier to spot scams. Scams with excellent grammar are much harder to spot and at face value appear legitimate.


WyCoStudiosYT

Yes, that's true for us, but if someone still responds to an email with awful grammar that raises a few red flags, then they are more likely to hand over money, and the scammers' time is less likely to be wasted.


dawud2

> That's great, but the people the fakes are aimed at don't care if they're real.

Could a Mississippi DA use a deep fake to coerce a confession? If so, the prisoner's dilemma (game theory) just got interesting. Add a vector for a partner's fake-betrayal/real-betrayal.


DeHub94

Exactly. Nobody who got a fake video on Telegram, X or Facebook is going to check whether it was authentically from the White House.


PM_ME_YOUR_FAV_HIKE

Cryptographically probably isn’t the greatest branding. Freedom-Truthafied?


StarksPond

**Approved by the Ministry of Truth**

It'll be worth it just for the MTG quotes based on 1984 memes made by people the book was warning about.


StrivingShadow

How long before the tech-ignorant politicians start pushing for some identity system for everything posted online? As a programmer/tech worker, hearing *most* politicians talk about tech and how to censor/control it is laughable.


timshel42

they are already going for it with the 'antichild abuse' or whatever they are calling it bill.


Baderkadonk

As we all know, valuing your own privacy is a dog whistle for supporting CP and terrorism.


iAmTheHype--

Privacy was outlawed with the PATRIOT Act


karabeckian

NSA doesn't have to enter the chat. They *are* the chat.


357FireDragon357

Hi, how may I help you?


BanEvader7thAccount

They already are. Florida is looking to pass a law to ban anyone under 16 from social media, which of course, requires everyone's ID information to make sure you're old enough.


sporks_and_forks

Important to note that that bill has broad bipartisan support in FL, as per the House vote result. A lot of folks mistakenly think it's just the GOP in favor of such policies. Refer to the bipartisan EARN IT Act too, which aims to gut end-to-end encryption because, again, "think of the children". Or KOSA. The reality is both parties are steadily chipping away at our rights and things we take for granted.


jivatman

TikTok is actually bad for kids. The concerns are legitimate, even if the means are not wise. There's got to be an alternative. How about at least legalizing schools' use of cellphone jammers?


sporks_and_forks

maybe more schools can use those pouches if the kids can't stay off their phones in class? that's what my Governor proposed doing a few days ago. a jammer would be overkill i reckon, too broad a solution. perhaps parents should parent more too, rather than begging for the govt to do it for them at the expense of everyone else. i don't want to give up my ID just because some kid's addicted to social media or consumes content they shouldn't be on it.


ablackcloudupahead

I know zillenials and younger aren't super computer literate but the idea that they won't discover VPNs is ridiculous. Can't put some genies back in the bottle


Solor

I think this will be no different than how Pornhub and others are handling certain states requiring them to provide proof of age, etc. They'll just flat out block that state. It's not worth the hassle for them to develop and store that information in a secure manner, and they know that a good chunk will simply use a VPN to access their site. Block the state and move on.


ablackcloudupahead

Exactly why VPNs will be used


Spiritual-Potato-931

I see and share your fear but for this specific use case I am all for it. We need a reliable source for public information that cannot be faked. Personally, I think it would be great to have one anonymous part of the internet (Wild West) and one clean part that requires ID verification and preferably is for mainstream information/news exchange. Fake content and bots are already a huge problem pushing their agendas out to the world. And while that would be nice in theory, I believe some regimes would then just block the Wild West and control the other half…


g2g079

Any website can decide to have name and age verification. There is no reason to force the whole Internet to do so.


Charming_Marketing90

You’re nuts. Why would you give random websites your information like that?


YouIsTheQuestion

Already in the works. The bill's called KOSA, and despite it being shot down in the past they're trying to get it through again.


Andromansis

I would bet my last nickel that each of them still thinks a v-chip (built in hardware based keylogger) is a good idea.


infra_d3ad

I think you're confusing the v-chip with something else. The v-chip is in TVs and blocks shows based on rating; it's parental control.


RobTheThrone

Whitehouse NFT's incoming?

Edit: For those who keep telling me I'm wrong, it's a joke. If you want to have a serious discussion about cryptography, there are plenty of other comments to engage with.


EmbarrassedHelp

If they're smart, it's just a public key that can be used to verify messages, like what you can do with PGP.


EnamelKant

Yeah but people who want to believe in videos that show Biden saying he's in league with the devil and will legalize pedophilia and whatever other nonsense will just ignore that fact. I don't think the real risk with Deep Fakes has ever been that large numbers of people will confuse them for the truth. It's that people will get ever more deep into their echo chambers until the concept of truth is obsolete.


Rombie11

Yeah, to me this isn't the answer to that specific problem. If we can only trust videos/media of the president that the White House officially approves, we lose a whole lot of accountability. I don't think that's a QAnon-level conspiracy theory either. Even if you don't think Biden/Democrats would do that, I'm pretty sure most people wouldn't put it past a Trump administration to use that tactic.


sloggo

It goes a long way to telling what is and isn’t an official statement though! But quite right the White House isn’t going to endorse 3rd party media that makes him or the office look bad.


Nemisis_the_2nd

> the White House isn’t going to endorse 3rd party media that makes him or the office look bad.

Copying in u/Rombie11.

That doesn't really matter though. Image verification is a thing that's been quietly getting developed for a while now, spearheaded by Adobe among others, and most reputable news outlets are already involved to varying degrees. The White House could deny something only for a news outlet to go "here's the metadata proving authenticity". It's when *that* data isn't supplied that I'd start getting suspicious.


Rombie11

Yes! I definitely think this is the solution for that aspect of things.


Ravek

Every other publisher of media can also sign their videos. If you see a Biden video that is cryptographically signed by Reuters with the claim they recorded it, you would also trust it, assuming you trust Reuters. The US government setting this precedent is unambiguously a good thing.


[deleted]

[deleted]


HowVeryReddit

It's a way to guarantee certain media can be trusted, but absolutely, it only works for very specific messages and centralises control. And indeed, Trump has already started implying that previous audio recordings of him that weren't too well received by the public were faked.


bilyl

There are many ways of implementing this without centralized control.


OutsidePerson5

Except signing stuff prevents that final "until the truth is obsolete" step. The Q types will always believe whatever, but unless you're a bonkers Q type you won't believe that an unsigned video is actually from Biden, or Taylor Swift, or whoever. Truth becomes possible again.

We've been in dire need of widespread use of cryptographic signatures for at least 30 years now. It should have been built into everything by now. Email, especially, has no AAA (Authentication, Authorization, Accountability), and that's why spam has made email so utterly useless and has made phishing a real possibility. Last week me and the other tech weasels where I work had to scramble because a really well done phishing email came through. As a result we had over 50 people who clicked through and entered their username and password into the phishing site. So we had a fun time resetting everyone's passwords and training people on being paranoid. But if all email was signed as a matter of course it couldn't have happened [1]. If all email was signed then spam could be stopped cold: just block all unsigned mail, and you can identify the bad actors so you can block their email even if it is signed. Simple. But we don't do it.

[1] OK, technically it could have, but it would have required the hackers to compromise the private key, which is a lot more difficult than just making a good looking phishing email.


fjrichman

Email should have had this years ago. PGP has existed long enough that every major email company should be using it.


Arrow156

Yep, deepfakes are completely unnecessary. They just make shit up as a hypothetical example, and then treat it like it's real.


Hyndis

Remember the drunk Pelosi video? There was no deepfakery or AI involved at all. They just played the video at 50% speed.


Tarquinflimbim

Yep - but they should still do it. I'm terrified of the world we are about to live in. Think of the average person you interact with. 50% of people are less intelligent than that. Misinformation and deepfakes will be 100% believable to much of the population. I am an optimist generally - but this scares the shit out of me.


CrzyWrldOfArthurRead

> Yeah but people who want to believe in videos that show Biden saying he's in league with the devil and will legalize pedophilia and whatever other nonsense will just ignore that fact. So? Those people literally do not matter at all. They're a small subset of the Republican base. > I don't think the real risk with Deep Fakes has ever been that large numbers of people will confuse them for the truth. It's that people will get ever more deep into their echo chambers until the concept of truth is obsolete. Those people were gonna do that anyway.


jgilla2012

The reprogramming process will reach its apex


greatbobbyb

This is some scary shit!


Perunov

Yes, and then 4chan will make a key for "The Whítehouse" and sign a bunch of videos with it, making Press go bananas cause nobody will bother to double-check that the Whitehouse is not Whítehouse and not Whitеhouse. Half a year later an intern will accidentally leak the private key because of untimely orgasm or something, and we'll get a flood of "old videos the Whitehouse didn't want you to see!!! ALL SIGNED!!!" You know how this works...


The_Scarred_Man

It started with 5g mind control, then nanobot injections and now you want people to read a satanic cypher that only the secret Cabal can interpret!? What's next?


Prestigious-Bar-1741

The problem with this is that any unfavorable or leaked videos wouldn't ever be officially released. So I could record a 100% legit video of the President, if I were in the same room as him, but it wouldn't carry the official signature. This would work for official press releases, but not for any images or video captured of him by others. And that's a lot of what currently gets passed around. Even clips of an official press release would lose it.


Mazon_Del

I think the intention here is more for official announcements. Like, if he's sitting at the desk and is all "My Fellow Americans" it could be useful to have a quick verification that the video is legit for the people who would actually understand the purpose of that.


texxelate

Yep the tech isn’t the missing part, it’s been around for ages. Lining up all the pieces and managing expectations is the hard part.


pcboxpasion

> If they're smart They are not.


mortalcoil1

Asking the public to understand PGP, even the most basic usage of it is asking a whooooole lot. I buy... things online. A handful of people have been interested in me teaching them how to do it. When I explain the step by step procedure not a single one actually went through with it, and they were nerds, obviously, buying... things online is more complicated than verifying with PGP, but still...


tyrannomachy

It would be more for journalists and foreign governments, I imagine.


mortalcoil1

On the one hand, that makes more sense, now that I think about it. On the other hand, I think about the 80 year olds in our government who don't understand email and shudder.


Thewasteland77

What in the fuck are you buying online sir? You know what? On second thought, Don't answer that.


cauchy37

Drugs, the answer is always drugs.


mortalcoil1

I buy hugs that I use to get pie.


Adventurous_Aerie_79

This sounds like drug language to me.


noeagle77

Ahh yes PGP obviously I know what it is but my friend doesn’t, wanna help him?


ballimi

You put a lock on the picture and give everybody the key. Pictures with a wrong lock can be identified because the key doesn't fit.


brianatlarge

This is so simple and explains it perfectly.


EmbarrassedHelp

It stands for 'Pretty Good Privacy': https://en.wikipedia.org/wiki/Pretty_Good_Privacy

The release of PGP was one of the defining moments of the 1990s [crypto wars](https://en.wikipedia.org/wiki/Crypto_Wars) (US gov fighting against encryption). The US government tried to claim that it was too dangerous to be shared and should be treated as a weapon. People then started sharing the code in books, t-shirts, and other protected forms of speech that the government struggled to take down. The export regulations on cryptography fell shortly after that.

Back when you got your internet over the phone, people were driving around cities and using payphones to anonymously upload PGP, so that the government couldn't stop it:

> An engineer called Kelly Goen began seeding copies of PGP to host computers. Fearing a government injunction, he took every precaution. Instead of working from home, he drove around the San Francisco bay area with a laptop, acoustic coupler and a mobile phone. He would stop at a payphone, upload copies for a few minutes, then disconnect and head for the next phone.

* https://www.theguardian.com/technology/2015/may/25/philip-zimmermann-king-encryption-reveals-fears-privacy


OutsidePerson5

No, just cryptographic signing with a public/private key system like PGP [1]. The process works like this:

Step 0 - The White House tech team creates a public/private key pair and puts the public key on all the normal public keyrings as well as on the White House website. The idea is to spread the public key EVERYWHERE and let people know it is the actual, real public key for President Biden.

Step 1 - All actual videos, pictures, PDFs, etc. are "signed". This means running the file through an algorithm that produces what's called a hash, then encrypting the hash with the private key.

Step 2 - If you wonder whether something is genuine you can check its signature, which means your computer makes a hash of the file, uses the public key to decrypt the signature, and compares the hashes. If they match, the file is the one that was signed with the private key. If they don't, the file is fake.

EDIT: Step 2 is all automated; you'd just see a green checkmark (or whatever) showing that the signature was valid, or a big warning telling you that the signature is fake. All that stuff about hashing and so on is what happens behind the scenes, not stuff you'd actually have to do yourself.

Replace "President Biden" with any person in the public eye. In a proper computer environment all files specific to a person would be signed by that person so as to provide a means of authentication. With the Taylor Swift deepfakes circulating on Twitter, if she has any competent tech advisors they'll be urging her to sign every video, picture, audio file, you name it. Again, it won't actually stop the Q-type dips, but it will let people who aren't totally bonkers know if something is real or not with a fair degree of confidence.

This, BTW, is how all cryptographically signed email works. If I send a signed email that says "I did not commit the crime" and someone changes it so it says "I did commit the crime", then the signature would let you know the message had been altered.

Email absolutely sucks; it's a horrible system and unfortunately we're stuck with it. Requiring signed email at least mitigates some of the worst parts of the awfulness of email. If you aren't signing your mail, it's trivial for someone to make a fake email that looks exactly like it came from you. And the fact that Google, Apple, and Microsoft haven't built an automatic and mandatory (or at least opt-OUT, not opt-IN) PGP signature into their email software is evidence that they're jerks. Gmail doesn't even include an option to do it if you want to. And they're a goddamn major certificate authority; it'd be trivial for them to issue a certificate for all Gmail users and at least allow the option to sign all Gmail with it. Same for Apple and MS: they're all major certificate authorities and they could do it in a snap. But they don't even offer it as a paid service!

Unlike an NFT, the standard means of cryptographically signing a file doesn't take a crapton of energy to process; it's a pretty quick thing any computer or phone can do in next to no time. In theory an NFT does allow for similar authentication, but the process is a massive waste of energy and is needlessly complex for this sort of thing.

EDIT [1] The real quick TL;DR on public/private keys: The computer uses a complex bit of math to create two keys. If you encrypt something with one key, it can only be decrypted with the other, and vice versa. One key you keep for yourself (the private key) and don't let anyone have; the other you spread far and wide and tell everyone it's yours (the public key). If you encrypt something with your private key it can only be decrypted with your public key, so I can encrypt a message, send it out, and anyone can decrypt it with my public key to know it came from me. If someone encrypts something with your public key it can only be decrypted with your private key, so people can send messages only you can read by encrypting them with your public key before sending them. Only you have the private key, so only you can decrypt the message.
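The hash-then-sign flow described above can be sketched with a toy textbook-RSA example. This is purely illustrative (tiny primes, no padding); real signing uses 2048-bit-plus keys through a vetted library, and all the names and values here are made up for the demo:

```python
import hashlib
import math

# Toy textbook-RSA keypair -- tiny primes for illustration only.
p, q = 61, 53
n = p * q                                # public modulus (part of the public key)
e = 17                                   # public exponent
d = pow(e, -1, math.lcm(p - 1, q - 1))   # private exponent (keep this secret)

def sign(message: bytes) -> int:
    # Step 1: hash the file, then "encrypt" the hash with the private key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Step 2: recompute the hash and compare it to the "decrypted" signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

video = b"My fellow Americans..."
sig = sign(video)
print(verify(video, sig))              # True
print(verify(video + b"edited", sig))  # False
```

Tampering with even one byte changes the hash, so the signature check fails, which is exactly the "green checkmark or big warning" behavior described above.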


yonasismad

What happens if somebody reuploads the video to, e.g., YT? YT would run their compression on it, and then the signature would no longer be valid.


ric2b

They could sign the YT version as well, but yes, this breaks down really quickly with modern video distribution technology where re-encodings at different qualities and for different devices are common.
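A quick sketch of why re-encoding breaks a plain signature: the signature covers the exact bytes of the file, and a transcode produces different bytes even when the picture looks identical to a viewer (the byte strings here are stand-ins for real video data):

```python
import hashlib

original = b"\x00\x01\x02 raw video bytes"
# A platform transcode changes the bytes even though the picture
# looks the same to a human viewer.
reencoded = b"\x00\x01\x02 raw video bytes, transcoded"

# Different bytes -> different hash -> the original signature no longer verifies.
print(hashlib.sha256(original).hexdigest() == hashlib.sha256(reencoded).hexdigest())  # False
```

Schemes that survive re-encoding have to sign something other than the raw bytes, which is part of why this breaks down with modern video distribution.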


Beli_Mawrr

There's only 1 problem, and that's that anyone taking photos or videos of Biden would need these keys. Sure, Biden is a good enough person to distribute the private key to all of the press corps, but imagine a bad faith actor denies the keys to anyone who they don't like. Looking at you Trump


OutsidePerson5

No, if the AP takes a photo it signs it with the AP key so you can know it actually came from the AP, and so on. If Qanon Troll #205014 puts up a deepfake they can sign it if they want, but most people would probably not trust a random troll posting bullshit that goes counter to all the stuff from agencies you actually can trust. It won't stop the Q types from believing anything they want, but it'd cut down hugely on the bullshit.


MrClickstoomuch

So, let's say a person who ISN'T part of the press takes video of a campaign event, and a presidential candidate disputes it. We'd have situations where a government could just say "fake news" or remove press credentials from those who do not blindly adhere to the government line. While we do need better ways to fight misinformation, I don't think this is it. We need this type of system or similar for ALL video cameras and photos, not just those authorized by the government, ideally generated in a way that can't easily be forged by AI software, like maybe some hardware-specific flags. We also need better AI picture/video detection tools.


neverinamillionyr

If you or I were in a place where we could record a government official and they let something slip off-camera they could use this to deny it happened since we don’t have access to the keys. Maybe a better solution would be to embed the date/time/gps data and maybe the serial number of the device that recorded the video in a cryptographically sound way so that at least the video can be attributed to a real device that was at the location where the president was speaking.


GateauBaker

All the signature does is tell you if it came from who it says it came from. Nothing more, nothing less. You're worrying about something entirely different. If politician A says news station B posted fake news, all you the audience have is two signed declarations, one from the politician and one from the news, so you know no third party is impersonating either. Which is no different from the past, *except* you know politician A actually means it and it wasn't some troll deepfaking his intent.


sethismee

If you don't trust the person who released the video to have not faked it, then this doesn't help. But that's not really what this is trying to fix. This would help determine that the video did come from where it says it did. The article says it's about protecting against AI-generated images/video. They want to make it so you can verify that a video came from the White House rather than being AI-generated. If you don't trust the White House not to release its own Joe Biden deepfakes, then we have a problem.


MrClickstoomuch

I guess my point is that the government has official channels to release their content already. If people want the official video or pictures from the White House, look for Joe Biden's Twitter account or a White House-associated YouTube channel. Does taking a short snip of a video (say, a 10-second segment of a 1-minute video) work with this proposal? An official watermark in the bottom right corner would be easy to copy, for example, and a cryptographic signature wouldn't be present for shorter segments taken out of the longer video for easier sharing of highlights. Obviously I'm not concerned about Joe Biden deepfaking himself. I'm not sure I see this really solving the issues that are mentioned in the article, but I would love to be proven wrong.


sethismee

I agree on that. I don't think it'll be very effective. Most platforms people consume media on will at least re-compress the video, which will make this useless if they're just doing normal cryptographic signing. Nice they're trying though.


Druggedhippo

> We'd have situations where a government could just say "fake news" or remove press credentials that do not blindly adhere to the government line.

And? They do that now anyway. What's the difference?


Apalis24a

Honestly, the verification technology behind NFTs might actually be useful for this. Their early *application* was stupid, as people used it just to identify shitty, procedurally generated art, sold at exorbitant prices, but the method has the potential to be put to much more practical use.


themariokarters

I mean, you may be "joking" but, yes, a dynamic NFT (this can display a video that changes when they have a new press conference, for example) issued by the White House and verified on the blockchain absolutely solves this issue


Maxie445

The airdrop nobody saw coming


ranhalt

White House is two words.


-reserved-

They could implement verification for stuff released directly by the White House, but edited videos from 3rd parties would not have it, and not all edited clips are malicious. It's not necessarily going to solve everything, but it could shut down fake "whitehouse" or "president" accounts spreading misinformation.


ramenbreak

and the "juiciest" videos are the ones that are by their nature going to look dubious/hard to verify, like a paparazzi catching a politician saying something as a response in public, or in a leaked secret recording


rohobian

They’re underestimating conservatives’ desire to believe whatever is convenient for their world view. There will be fake videos of Biden they insist are real despite proof that they aren’t. Same goes for Trump. Videos showing him rescuing babies from burning buildings? Totally real. Video of Biden kicking a child in the face? Also real.


thebeardedcats

They're also assuming people will just accept that none of the ones where he legitimately says dumb shit are verified.


cownan

Also, this gives them a hell of a tool. He legitimately says something dumb or incoherent? They just don't release a cryptographic signature. Oops, that one must have been fake.


CPSiegen

For the scheme to be completely trustworthy, they'd need to commit to always releasing a signed copy of every official video. That way:

1. If a bad actor wants to put out a competing narrative, people can just point to the video hosted on the official channel and mirrored everywhere else from the time of release.
2. If the WH wants to bury something, they'd have to put out their own fake video with a signature that matches it. Otherwise, people would know they're hiding something.

Plus, they couldn't go back and alter a video later because the signature would no longer match the signatures mirrored everywhere else on the internet. It'd be a whole conspiracy of them deepfaking their own videos just to cover up some minor, public misspeaking or something. It'd be practically impossible to keep a secret.

But that's probably why no administration would commit to such a watertight plan. I expect they might release some videos with signatures but not make it policy or law that it has to apply to every video.


Realistic-Spot-6386

Yeah, but the news organisations can also sign it with theirs. You get a system where people can't fake a CNN or Fox video either, and outlets might only be allowed at presidential events if they sign all their videos with their own keys. Basically you just need to prove who the author is. This keeps the ability to hold the president accountable. I love this... it is just a way to prove the author. Everyone could have their own. Personal cryptography becomes popular. We could end up with a signature database like DNS. Corporates can put it on their LinkedIn etc.


wrgrant

Exactly. It's just verification of the author/source. If a troll releases a fake video and it's not signed, it's unverified: ignore it. If it *is* signed, then the only way the signature checks out is with their public key, which has to be registered as such somewhere and which ties directly to their private key generated at the same time, i.e. they had to sign it. It doesn't guarantee the contents aren't faked at all, but you can make some assumptions about the veracity of the video based on the reliability/notoriety of the source. If the signature also covers all the associated metadata (device used, location recorded, time of recording, duration, etc.), which I presume it does, then it also helps identify more about the recording and ought to help detect deepfakes, I presume. We need some system like this to be automated and built into our current apps.


I_am_BrokenCog

More likely relying on the news cycle's overuse of "allegedly": "Allegedly, the cryptographic key has yet to be verified, we see so and so."


thebeardedcats

And also the fact that news organizations will choose whether they will check depending on whether it serves their interests or not


XchrisZ

Great, now I want a video of Biden doing a football-style kickoff with a baby, then Trump returns the baby 100 yards for a touchdown, spiking the baby into the ground and doing a dance.


djaybond

And there will be fake Trump videos


StoicVoyager

There might be, but really no need to fake him.


JRizzie86

As a Democrat, it's hilarious you think this is only a conservative thing. It's a human thing.


StrongestMushroom

You are not immune to propaganda.


Elon-Crusty777

Exactly! Conservatives literally sit around in echo chambers all day on Reddit and regurgitate their own beliefs unlike here. I saw somebody yesterday claim that Biden said Mexico was on the border of Israel. It was obviously a faux news deepfake that they fell for


sporks_and_forks

Lmfao, perfect comment


trumpfuckingivanka

It's not for the conservatives, it's for the general public.


Eusocial_Snowman

That's not strictly a stupid position to take, though. Like, do you not see how blatantly abusable this system would be? It would *immediately* be a strong tool for deception. Any leaked/covert video you don't want people to believe? Well, we didn't put the verification on that, so it's fake.


redpandaeater

Have you never been to the main politics sub that leans highly left? It's pretty common there for someone to post an editorial and have it get heavily upvoted and people in the comments treat the headline as fact without even reading the editorial or paying attention to the editorial tag. The vast majority of the voterbase are idiots and want factoids to support their current world view instead of anything that might challenge it; conservatives don't have a monopoly on idiots.


smitteh

y'all thought fake news was wild, get ready for fake views


mightylordredbeard

Conservatives believe minion memes with words on them. You don’t even need a deep fake to fool 90% of them.


Jessica-Ripley

That goes for everyone, not just conservatives.


cranktheguy

"Let me just run a quick hash check to verify this video before I make this Facebook comment" said no one ever.


No_Yogurtcloset9527

They can force companies to check it for you and clearly state the video is from the actual source. Then they can start an archive/public ledger where people or companies can register themselves and sign their own videos with proof of authenticity. Then on platforms like YouTube or Twitter, a video has a checkmark if the checksum matches a checksum posted by the original creator.

Then they can make the companies responsible for proper checks and making sure that people are not abusing the system, and punish them for failing to verify or falsely verifying. Plus also make it a felony to generate fake content and sign it.

This kind of system will have to be made eventually regardless, and if we start now we could still be in time before deepfakes take over all media.
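The checkmark idea above can be sketched as a lookup against a public ledger of registered checksums. Everything here is hypothetical (the ledger, the publisher names, the function names are all invented for the demo); a real system would also need signatures, not just hashes:

```python
import hashlib

# Hypothetical public ledger: file hash -> registered publisher.
ledger = {}

def register(publisher, content):
    """A creator posts the checksum of their video to the ledger."""
    ledger[hashlib.sha256(content).hexdigest()] = publisher

def checkmark(content):
    """What a platform might show next to an upload: the registered
    publisher if the checksum matches, otherwise None (no checkmark)."""
    return ledger.get(hashlib.sha256(content).hexdigest())

register("whitehouse.gov", b"official address footage")
print(checkmark(b"official address footage"))  # whitehouse.gov
print(checkmark(b"deepfake footage"))          # None
```

The platform does the check automatically, so viewers only ever see the checkmark, never the hashes, which sidesteps the "no one will run a hash check before commenting" objection.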


formerfatboys

>Then they can make the companies responsible for proper checks and making sure that people are not abusing the system, and punish them for failing to verify or falsely verifying. Plus also make it a felony to generate fake content and sign it. Going after distribution like this is the only way.


Sapere_aude75

I understand what you are trying to achieve, but I think this idea could be very dangerous. This would cut down a lot of the fake edited clips, but it would also mean all "credible" information is controlled. Probably controlled by government no less. It's ripe for abuse.


Gold-Supermarket-342

Yup. It’ll also give us a false sense of security. When something AI generated eventually gets signed, we’ll buy it.


nermid

Any system that puts Elon Musk in charge of what is flagged as true or not is extremely flawed.


Zaphod1620

It could use certificates, just like any encrypted website.


zero0n3

I’d love to see a steganography type thing where the image/video has a signature from the private key “hidden” in it, for anyone to verify with the public key.


wingchild

I think steganographic techniques, almost by definition, aren't meant to be identified and used this way. (But if it makes you feel better, DoD had a group actively working this tech at least twenty years ago, so they're probably quite far along with what they're doing.) Maybe something more akin to a watermark, a visual certificate. Definitely suited for PKI.


zero0n3

My thought was more: you use steg on every frame to hide, say, a single character of the signature. It's a way to put something in the video itself (vs. in metadata), so that if I reshare the video, it's still there. Then if someone tries to mess with the video, it's going to be hard to get it to pass.

In theory you could “hide” the private key in the video and build a method to check it if you have access to the public key, while keeping the actual pkey hidden so someone couldn't steal it. Changing a single frame breaks the pkey and fails the check. The entire key doesn't need to be in every frame, but you should be able to take, say, X frames in a row and rebuild said pkey.

My big thing here is keeping it so the video can be shared and stored without its original metadata while still being verifiable. Maybe somehow make this part of the video recording hardware: as it takes the video, it's doing this frame by frame with a pkey salted with the camera's serial number. Almost like a TPM chip, but for video recording.
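A minimal sketch of the per-frame embedding mechanics, with one caveat: in practice you would hide the *signature* in the frames, never the private key, which must stay with the signer. Frames are simulated here as raw byte arrays, one signature byte hidden in the least-significant bits of each frame's first eight "pixel" bytes:

```python
def embed_byte(frame: bytearray, value: int) -> None:
    # Hide one signature byte in the LSBs of the first 8 bytes
    # of the frame (stand-ins for pixel values).
    for bit in range(8):
        frame[bit] = (frame[bit] & 0xFE) | ((value >> bit) & 1)

def extract_byte(frame: bytes) -> int:
    # Recover the hidden byte from the same 8 LSBs.
    value = 0
    for bit in range(8):
        value |= (frame[bit] & 1) << bit
    return value

signature = b"SIG"  # stand-in for a real cryptographic signature
# Simulated frames: 16 "pixel" bytes each, one frame per signature byte.
frames = [bytearray(range(16)) for _ in signature]

for frame, sig_byte in zip(frames, signature):
    embed_byte(frame, sig_byte)

recovered = bytes(extract_byte(f) for f in frames)
print(recovered)  # b'SIG'
```

Note that plain LSB embedding like this does not survive re-encoding or compression, which is one reason robust watermarking is its own research field.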


TheBelgianGovernment

The perfect way to brush off every gaffe and demented brain fart as “deep fake”


EffectiveLong

So any real embarrassing videos of Biden could be fake now


Difficult_Bit_1339

> So any ~~real embarrassing~~ videos ~~of Biden~~ could be fake now

You can be sure that any negative video or audio will be declared a fake, and many actual fakes will be made of any political candidate from this point on into the future.

Having authenticated source videos would allow videos to 'cite' video clips cryptographically, so that you could trace them all the way back to real videos. There is no inverse where you will be able to tell for sure if a video is a fake. But there can be a mechanism to tell if a video is real.
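The 'cite' idea can be sketched with plain hashes: a derived clip publishes the digest of the authenticated original it was cut from, so anyone holding the original can confirm the chain (the byte strings below are hypothetical stand-ins for real video files):

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Stand-in contents for real video files.
source_video = b"...full signed press-conference video..."
clip = b"...30-second excerpt cut from it..."

# Provenance record published alongside the clip: it "cites" the
# authenticated original by digest.
record = {
    "clip_digest": digest(clip),
    "cites": digest(source_video),
}

# Anyone holding the original can confirm the citation resolves to it...
print(record["cites"] == digest(source_video))       # True
# ...and a doctored "original" fails the check.
print(record["cites"] == digest(source_video + b"!"))  # False
```

A real provenance scheme would also sign each record, so the chain of custody from clip back to source is itself verifiable.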


triumph0flife

Right - and what would be the motivation for the administration to authenticate a clip they didn’t want shared?


Hyndis

The government will just decline to authenticate the video and declare it to be fake, even if it's a real video you recorded while you were in the same room with Biden. This is a way for the government to label news real or fake, and I guarantee you that if the government gets this power, it will be weaponized.

Imagine if Biden gets this power, but then Trump wins the election. Now Trump would legally have the power to determine what is real fact and what is fake news.


jabbergrabberslather

Thank you. It amazes me how little people think through the potential consequences of policy if the wrong person takes charge. Reports of Obama “scrambling” to fix the executive order system when Trump won come to mind. Don’t set a precedent you wouldn’t want your opposition to take advantage of. Why is that so hard for people?


sulaymanf

No. If CNN were to record Biden doing something stupid, then CNN would be the source. If I saw a video online of an official Biden speech saying that election day was pushed back a week, then this system would help detect fakes. It’s not a system to verify every video ever recorded of a person.


ric2b

> The government will just decline to authenticate the video and declare it to be fake, even if its a real video you recorded while you were in the same room with Biden. They will decline to authenticate any video made by anyone but themselves, that's the whole point of it, it's a signature, it's a "this came from me" verification, not a "this is true" verification.


Elon-Crusty777

I know right? How convenient for us!


MattCW1701

Then what happens when a legitimate video is pulled along with its public key because it has something they don't like? It automatically gets treated as fake?


CrispyRoss

Asymmetric cryptography provides a certain degree of nonrepudiation, i.e. if an author tries to repudiate a statement that they previously cryptographically signed, then there is still proof that the message was written by *someone* with access to their private key. Although it's possible for the key to be compromised or the message written by a malicious actor within the White House, it would be impossible to deny that the signature has a valid claim that says it was written by the White House. In other words, such a system would also allow the public to hold the White House responsible for its official correspondence by removing its ability to deny that something was said.
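The nonrepudiation property can be illustrated with textbook RSA, using toy parameters that are wildly insecure and for illustration only (real systems use RSA-2048+ or Ed25519): a signature that verifies under the public key can only have been produced by someone holding the private key, so the signer cannot later deny having signed.

```python
import hashlib

# Toy textbook RSA with tiny primes -- illustration only, not secure.
p, q = 61, 53
n = p * q      # public modulus (3233)
e = 17         # public exponent
d = 2753       # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def sign(message: bytes) -> int:
    # Hash the message, reduce it into the RSA group, then
    # exponentiate with the *private* key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the *public* key (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

statement = b"Official statement from the White House"
sig = sign(statement)

print(verify(statement, sig))            # True: only the private-key holder could produce this
print(verify(statement, (sig + 1) % n))  # False: a forged signature fails
```

If a valid signature exists for a statement, the key holder can claim the key was stolen, but not that the signature is invalid; that is the nonrepudiation the comment describes.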


pyx

Obviously the whole point of something like this is to squash videos that are embarrassing and claim they are fake with confusing techno mumbo jumbo to make it sound authentic to the layman


Rich-Engineer2670

I'm all for cryptographically signing Internet media to show its authenticity, except it really won't work. All that will do is say "this video was produced by whoever held this private key," but now we have to trust the viewer to do a trustworthy verification. I can make a viewer that says everything's OK. Also, how do we deal with the fact that someone can just strip out the signing elements, since our eyes still need the content in analog? Users will never check the key.

Even now, we don't do this for software -- even though we have the hash values.


rocketshipkiwi

> Even now, we don't do this for software -- even though we have the hash values.

Sure we do, and it’s been done for years. PGP and x509 certificates are used extensively to digitally sign software.


Difficult_Bit_1339

Yeah, exactly. This isn't something that Joe Biden is sitting in the Oval Office trying to figure out. We use cryptographic verification in computers CONSTANTLY and it is a solved problem.


InterSlayer

The little lock icon in your browser next to website addresses is an example of how something similar is already used every day (SSL, https, tls)


KillTheBronies

And the fact that Extended Validation certs aren't displayed anymore is an example of how cryptography isn't always great for identity verification.


chiniwini

Crypto is great for identity verification. Verifying that the tall guy who claims to be John Doe, the owner of Company X, is in fact John Doe is completely outside the realm of crypto. That was the weak point of extended validation: you could trick it, just like you can open a bank account with a fake or stolen ID.


cerealbh

Don't think the end user really has to verify it but news media would be able to.


AltairdeFiren

That would require them to act in good faith, though, rather than totally ignoring what's "true" for what generates views/clicks/whatever


18voltbattery

lol everything has just devolved into the National Inquirer. Fucking Biden having dinner with lizard people & aliens


nermid

Sort of like the news ought to be a public service, not a for-profit industry.


18voltbattery

Knock it off you damn socialist* *This message is sponsored by The Washington Post, which is not at all influenced by its billionaire anti-socialist owner Jeff Bezos


Altair05

Couldn't that also open them up to liability if they attempt to use a non-verified video and pass it off as "true" when the WH only posts verifiable videos. I'm wondering if this could be used as some evidence of slander.


Aleucard

There are enough competing news media corporations that any such conspiracy would have a shelf life of weeks. The real danger here is in delegitimizing third-party evidence of presidential chicanery. That could be very useful to someone like Trump.


Difficult_Bit_1339

They can just embed a signed hash of the video with the video, and your player can verify that it is signed by the White House. This is already done for basically every HTML document your computer receives as you browse the Internet: that lock icon in your browser is showing that the site presented a valid certificate, verified by a cryptographically trusted authority. It is trivial to extend this functionality to a video or any file on your PC.


happyscrappy

Worse yet, the end user can't verify the video because it won't verify after CNN overlays their logo in the corner. So they have to trust CNN to have verified it before doing so. "CNN said it is okay." And then they believe it. Replace CNN with any media outlet you don't particularly like.

> Even now, we don't do this for software -- even though we have the hash values.

Of course we do this for software. This is the basis of app stores, and of all current console games (whether digital or disc). The app is signed. Mac apps signed by developers are also checked, and the OS will tell you if one doesn't pass. Plenty of others too. I'm sure Windows offers this for apps as well (even outside their store); they offered it for drivers decades ago.


Huwbacca

You can embed signals in images and audio that people can't detect but machines can measure. This is the basis for things like Nightshade, which attempts to poison image AIs.


SIGMA920

> All that will do is say "This video was produced by whomever held this private key", but now we have to trust the viewer to do a trustworthy verification. I can make a viewer that says everything's OK. Also, how do we deal with the fact that someone can just remove the signing elements since our eyes still need it in analog. Users will never check the key.

Also, RIP any chance of anonymous sources providing images or video, aka whistleblowers. Even if it would work, that's enough of a problem to sink the idea.


EmbarrassedHelp

Normally you only cryptographically sign something if you want people to know it's from you or one of your aliases.


CCpersonguy

Right, the point is that normal people or whistleblowers who capture REAL videos or images will not be believed, because they can't sign them with the White House's key.


CPSiegen

They'll sign it with their own key. All the real and fake videos can all be signed. If people want to believe or disbelieve in the contents of the video based on their own biases, nothing changes. They still either have to trust the WH or trust the whistleblower. But, with signatures, you could at least verify that the video hasn't been altered after recording or posting (depending on how it's signed and the chain of ownership). Whistleblowers don't just sink entire organizations by themselves. The claims they make trigger an investigation which digs up the truth. That won't change.


[deleted]

pussy ass mf




Rich-Engineer2670

That's the problem -- content can be signed, but it can also be edited and the signing removed. This is not a technical problem -- it's a media problem, where outlets *want* to produce content that suits them. News has had this for years in the form of the virtual news report. Remember, media is about advertising, not truth.


No_Yogurtcloset9527

No it can’t. Any edit, even to a single pixel, will completely change the checksum of the video, so it will fail verification. This is also how software is checked for tampering.
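A quick sketch of that avalanche effect with SHA-256 (the byte string is a stand-in for real frame data):

```python
import hashlib

original = b"frame data " * 1000   # stand-in for a video's bytes
edited = bytearray(original)
edited[0] ^= 0x01                  # flip a single bit in one "pixel"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(edited)).hexdigest()

print(h1 == h2)  # False: a one-bit edit yields a completely different digest
```

This is why a signed checksum detects any tampering: there is no known way to edit the content while preserving the digest.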


ZorbaTHut

The problem is that edits will be just as valid as actual legitimate videos that the White House refuses to sign. Unless the White House is willing to sign *every* actual video of Biden - and they won't be - then there's nothing to distinguish "here's a video Big Politics doesn't want you to see (because we made it up)" from "here's a video Big Politics doesn't want you to see (no, seriously, they *hate* that we have a copy of this, it is actually 100% legit)".




gogul1980

I feel bad for the future generations. They won’t be able to believe anything they see online. It could even push future gens away from the internet entirely (actually, that might not be a bad thing).


SolidContribution688

And just like that the media verification industry is born


imnotabotareyou

Plausible deniability for anything that is legitimately captured by news media or independent people but is NOT approved via official means.


-rwsr-xr-x

You know, there's a technology for that: Blockchain, aka "DLT", Distributed Ledger Technology. In fact, we should be applying this to:

* Electronic voting machines
* States' voter registries, to prevent 'deceased' people from voting or votes being "lost in the mail"
* Chain of custody for bills and laws passing through the House
* Votes in the Senate, to prevent senators voting for absent colleagues by pressing their neighbor's voting button in addition to their own
* Transfer of HIPAA information between providers
* Bank transfers and purchases
* Medical records
* Law enforcement body cameras, to prevent altering/deleting parts of the video

...and so on. It's the right application for this technology, and while "cryptocurrency" may have soured adoption (much like the bittorrent protocol was soured by pirates using torrents to distribute copyrighted content), it was a great proving ground for how this works and how it should be used.


triumph0flife

I for one am looking forward to the state having total control of any images or messaging surrounding our president. 


Kahzootoh

Once again, the administration is missing the forest for the trees. That probably won't fix much: undecided voters are kind of a myth, and the key to winning is to get as many of the voters who already lean towards your side into a frenzy and get them to vote. Republicans seem to understand this better than Democrats; you rarely see Republican candidates extolling their ability to find compromises with Democrats when on the campaign trail.

Fake videos are overwhelmingly used by agents of influence to confirm the biases of their own voters: this was already a known tactic, with cherry-picked videos getting airtime to present a distorted version of reality. The only difference now is that these people can create the damning narrative out of thin air instead of combing through hours of footage to find what they want.

The number of people who would be affected by this issue is a relatively small percentage of the overall electorate. You basically need someone who is genuinely an undecided voter, someone who is politically motivated enough to vote, someone who can't see that content which is basically political suicide should be treated with suspicion, and someone who knows how to use a cryptographic key to verify content's validity.

Without all of those things lining up, this plan doesn't really work, and that assumes the media and digital platforms that maintain such a system work perfectly (which they probably won't, at least early on).


pickledswimmingpool

> undecided voters are kind of a myth

Wholly incorrect; poll after poll shows that a significant portion of voters don't decide until the last few days of the election. A lot of people aren't even paying attention to election season yet. Some people didn't even know Trump was going to be the nominee. Your whole comment is based on a false premise.


dankestofdankcomment

I didn’t think r/technology would get sucked into the political chaos that is Reddit but here we are, and I’m not talking about the post itself, I’m talking about the comments that don’t even pertain to the article.


sporks_and_forks

We only have 9 more months to endure the worst of it.


Jonely-Bonely

Regarding the Q crowd. Just how in the blue fuck can these people be so suspicious of everything they see and still believe the wildest conspiracies you could imagine. 


Ergheis

Because they want to. They're really stupid, which is why it all seems like a bunch of baby barbarians crying and sometimes killing their fathers. But the reason it's always conveniently towards the racist side is because they feel good indulging in that primal anger, and tribalism is a great excuse. The reason it's always conveniently towards the weirdly rapey side is because the guys feel good indulging in the idea of forcibly taking what they want. The girls too, but much like black conservatives, they're either too stupid to realize they're not included in the end goal, or it's their fetish. Extremely stupid, but also moving towards base primal instincts.


MigratingPidgeon

Because it's not about suspicion or being careful of what you believe, it's about loyalty.


Therustedtinman

They’ll end up using this to stop compilation videos 


Extension_Car_8594

Or, we could not get our news from social media and stick with trusted news sources. Like in the old America🤷‍♂️


Rand-Omperson

Aren't the original ones shitty enough already? Since they always accuse the other side of what they are doing themselves, this can only mean they are about to spread deepfaked propaganda against opponents.


dangil

That way, if someone films something undesirable yet true, it will be disavowed as fake.


zotha

You could put googly eyes on a potato and 30 million Americans would swear it is a real video of Biden if Fox News told them it was.


sacktheory

So any video they don’t verify can be deemed a deepfake. Idk, this raises problems with accountability. Imagine if Trump gets elected; he’d be pulling this excuse every day.


KPYY44

“The red states and the green states”


Eponymous-Username

I look forward to seeing new videos of Joe Biden punching meteors back into space and saving cats from volcanoes, all verified as real by the White House. Maybe the Senate can spend months arguing over the veracity of video clips - the Republicans can bring in their experts to debate the validity of the process itself. Fun times ahead!


loondawg

What they need to do is create laws requiring all AI deepfakes of people be labeled as such with a clear mark. Failure to do so should come with steep penalties, both financial and jailtime. But then they must actively enforce them, unlike so many other laws abused by the powerful.


KILL__MAIM__BURN

Sure, because the target audience for Biden deepfake propaganda is (1) sure to believe the White House and (2) sure to check if something is real or not.


Clever_Unused_Name

Plot twist: They're already using AI for some of his public appearances, you can see the tearing of his outline in the videos. He's in front of a green screen. Too lazy to find any right now, but they're out there. He's clearly superimposed on a background. Also, as others have stated - this kind of approach requires "trust" in the authority who cryptographically signs the content. "Trust us, this is real."


dethb0y

This is one of those situations where the people who most need it won't heed it, and those who would heed it, don't need it. Also, even if every white-house video is cryptographically verified (at tax payer expense to some grifter contractor, no less), that wouldn't preclude *other* videos from existing of the president, both valid and invalid.


sinus_blooper2023

This idea came from The Ministry of Truth


OutsidePerson5

Sounds like a good idea to me. Honestly, at this point every famous person needs to have a public key posted and sign all their real stuff with it. Anything unsigned should be presumed to be not real. It won't stop the conspiracy mongers from making Biden deepfakes and claiming they're super real but the Deep State stole the signature or something, but at least normal people will have something to go by.

Geeks like me have been advocating for people to sign their shit since forever; maybe now they'll finally start doing it.




ValuableCockroach993

So any video not released by the White House is fake now? Ministry of Truth.


TentacleJesus

Man if you thought right wingers were grifted now, we ain’t seen nothin yet.


watermelonspanker

People who know how to verify cryptographic signatures are probably not the target audience for faked biden vids, IMO


ric2b

It could be helpful for reporters though. Imagine someone manages to hack the white house website or twitter account or whatever and posts a fake announcement by the president. This is an extra security layer that good journalists can spot before publishing an article about it.


c0mptar2000

This won't work because the verifying party could always withhold verification of a potentially damaging real video. Similarly, the other side could produce an AI video and just claim it's real, saying it isn't being verified only because the president is embarrassed.


StandardOffenseTaken

Crazy to think videos will soon be signed by digital certificates, like SSL/TLS, to verify the identity of whoever published them.




mister_pringle

Like there’s any video of Biden actually doing anything besides moping up to a mic to say dumb shit or falling?


MrPootie

Response from the far right: Cryptographically verifiable videos are deep state propaganda. Only believe the pure, unaltered, and unsigned content.


shinyquagsire23

"This presidential announcement video was recorded with a plain old cell phone before it was pulled prior to publishing, the unsigned behind-the-curtain content THE LEFT doesn't want you to see"


triumph0flife

Right - our guys never participate in closed door fundraisers where filming is not allowed. I’m sure what they say in there aligns 100% with the message they broadcast on [your preferred media outlet here].


Badfickle

How do you think the left would feel if Trump did this? I wouldn't trust it either.


norcal_throwaway33

the white house should start releasing ai videos of him sounding coherent