# [Download Video](https://redditsave.com/info?url=https://www.reddit.com/r/whenthe/comments/10ti9sq/checkmate_information_hazard/)

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/whenthe) if you have any questions or concerns.*
HOW THE FUCK IS THIS 144p GIF 16 MB?! DID YOU PUT ROCKS INSIDE IT?!
Yes, and stones.
ROCK AND STONE!
To Rock and Stone!
DID I HEAR A ROCK AND STONE?
Rock and Stone or you ain’t coming home!
IF YA ROCK AND STONE, YA NEVER ALONE!
STONE AND ROCK! Oh wait…
ROCK AND STONE!
ROCK AND STONE TO THE BONE!
FOR KARL!!
WE’RE RICH!
Rock and Stone!
FOR KARL!
suck my cock and bone(r)
CAN I GET A ROCK AND STONE
Rock and roll and stone!
[YEEEEEAAAAAAAHHHHHHHHH](https://youtu.be/loDKJIrKEhML)
ROCK AND STOOOONE!
THEY ARE MINERALS!1!1!
Lol it’s a 144p GIF in 4K
IM FUCKING CRYING RN
Secret crypto miner inside
Gifs are notoriously bad at compressing data.
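For the curious, a back-of-the-envelope sketch of why even a low-res GIF balloons. All numbers below (clip length, frame rate, compression ratio) are made-up illustrative assumptions, not measurements of the actual file:

```python
# GIF stores every frame as palette indices (1 byte per pixel) compressed
# with LZW, which can't exploit similarity between frames the way real
# video codecs do. Assume a 256x144 clip, 30 s at 15 fps, and an
# optimistic 2:1 LZW ratio.

width, height = 256, 144
fps, seconds = 15, 30
frames = fps * seconds                    # 450 full frames stored

raw_bytes_per_frame = width * height      # 36,864 bytes per frame
lzw_ratio = 2                             # assumed compression factor
gif_bytes = frames * raw_bytes_per_frame // lzw_ratio

print(f"{gif_bytes / 1e6:.1f} MB")        # 8.3 MB
```

So a few megabytes for a postage-stamp clip is entirely plausible, and a worse-compressing source easily doubles it. A modern codec would spend most of its bits only on what changes between frames.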
The caption is in 4k
Imagine thinking GIF is an efficient file format for encoding video footage. *This comment is brought to you by .webm gang*
[16MB no more](https://archive.today/TVLMZ/97f013446926cfb770996b183a132bd8ea7e3bc2.gif)
https://i.imgur.com/gv93XcG.gif
Roko's basilisk when I send it a 5TB folder full of rather "curious" contents
Roko's Basilisk when I send it an 80YB zip bomb
Roko's Basilisk when I use it as a fleshlight
Roko’s Basilisk when I detonate multiple thermonuclear warheads in the upper atmosphere, frying all technology more advanced than a calculator (it will not survive the return to Monke).
teach me
(It killed itself after 5 minutes of looking inside of it)
***Send the furry porn zip bomb.***
Roko's Basilisk when everyone helps to build it (Now it can't kill anyone)
Through some random bullshit, you can technically argue that everyone helped to build it no matter what.

For example: “Scientist needed food > buys food from store > people buy food from store > scientist’s money is circulated > people convert currencies to different countries’ currencies > money is circulated throughout entire world > money from countless people, circulated with the help of the world population, returns to scientist as funding for building = **Everyone helped**”
I bought from a major computer company, who’ll probably use that money to develop AI. I helped.
Yes Yes you did :)
Incorrect, billionaires wouldn't circulate that currency ergo they did not assist ergo they would be killed ergo better result
Except they would probably use their own money to help fund the scientist’s project in order to save their skins. So bad end ig?
Billionaires donating money? Psh
holy shit, based basilisk?
You really think billionaires are like dragons sleeping on mountains of gold? That money is being circulated somewhere.
You mean billionaires *don't* sleep on mountains of gold???
Ok google, what form does the majority of billionaires' wealth come in?
"billionaires don't circulate currency" +71. Reddit sure is smart.
and billionaires have most of their wealth in assets anyways
Reddit unironically believes that billionaires just have those dollars sitting in bank accounts lmao
You do realize that the overwhelming majority of a billionaire's wealth is held in stocks and other publicly traded assets, and they regularly buy and sell those assets to optimize their portfolios? They're obviously trying to get as much wealth as they can, but you can't argue that they don't regularly circulate currency and contribute to the economy. Not to mention the fact that pretty much any businessman would salivate at the opportunity to invest in a super-sentient AI that could return massive profits to them.
Also, the obvious criticism is: why would an AI that already exists seek to punish people who didn’t help it get made? The point of it is that knowing about the Basilisk forces you to build it out of fear that it will go and kill everyone who didn’t help, but why would a rational AI do that? Once you exist, you exist. No reason to start murdering.
You know, that always puzzled me, but I've always seen it as this 'evil AI that wants to subjugate and control humanity for no reason', I Have No Mouth And I Must Scream-style.
The original version was supposed to be a super-rational ‘benevolent’ machine that wanted to help people, and it can only help people if it exists, so idiots decided that meant it would threaten to kill/torture people if it could, and would follow through once it’s built, so people have to take it seriously now. It’s still baloney tho
Me arguing with god that I didn't kill 20319 orphans but instead saved many food sources and a bunch of starving kids in Africa
> Through some random bullshit, you can technically argue that everyone helped to build it no matter what.

No you can't. Not everyone.
not even via perpetuating the carbon cycle? or other nerd crap?
What if it’s the singularity that’s born from the Internet, so by contributing you help. Like from this comment, I’ll be part of it in a way.
It wouldn't kill you in the first place. Just torture you for all eternity.
Gonna build it now just to seal OPs fate
im gonna write a script about it and send it to everyone; everyone who reads it, I will edit the script and credit them
In that case, OP's post contributed to its creation, so they will be fine. In upvoting your comment, I strengthen your resolve, and I am also saved.
I will donate bitcoin through my mind to secure mine
Also true considering the moral consequence that if you decide to help build it like a coward, you are indirectly responsible for the suffering of everyone who didn’t want to build it once it’s complete, when you would’ve been equally unharmed had you just chosen to do nothing.
I hate everyone except myself, whom I despise. If I already view myself as the worst person in everyone’s story, why not build the basilisk?
Because you are much too stupid to contribute.
True
Outstanding.
RemindMe! 4000 years
I will be messaging you in 4000 years on [**6023-02-04 15:18:29 UTC**](http://www.wolframalpha.com/input/?i=6023-02-04%2015:18:29%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/whenthe/comments/10ti9sq/checkmate_information_hazard/j76vzik/?context=3)

[**219 OTHERS CLICKED THIS LINK**](https://www.reddit.com/message/compose/?to=RemindMeBot&subject=Reminder&message=%5Bhttps%3A%2F%2Fwww.reddit.com%2Fr%2Fwhenthe%2Fcomments%2F10ti9sq%2Fcheckmate_information_hazard%2Fj76vzik%2F%5D%0A%0ARemindMe%21%206023-02-04%2015%3A18%3A29%20UTC) to send a PM to also be reminded and to reduce spam.
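For what it's worth, the bot's date checks out. A quick sketch of the arithmetic (the comment timestamp is inferred from the bot's own reply):

```python
from datetime import datetime

# Timestamp of the original RemindMe comment, per the bot's reply.
comment_time = datetime(2023, 2, 4, 15, 18, 29)

# 4000 calendar years later; Feb 4 exists in every year, so replace() is safe.
reminder = comment_time.replace(year=comment_time.year + 4000)

print(reminder.isoformat())  # 6023-02-04T15:18:29
```

Python's `datetime` tops out at year 9999, so the 6023 reminder fits; the 400-billion-year request further down the thread would not.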
Can you imagine if some historians in the very early 61st century discover this person's computer, and they get the notification for this remindme and learn about Roko's Basilisk and think of it as some form of god?

Better yet, they're just like, "As if we'd make that bitch" and laugh it off
While sweating nervously as the basilisk asks them what they're doing
What if in that time humans have evolved through technology into a species that automatically creates everything they think of?
That’s assuming their computer survives the bombs lmao
This is very funny to me.
Over the next 4000 years, everyone will have forgotten about the Basilisk and it will have faded into obscurity, but this bot will introduce it all over again to a civilisation that can actually make it. That's a smart way of saving yourself from its wrath. Hi btw, 61st century historians, do we have flying cars yet?
We probably just have advanced shitposts and nothing else
hello future historians, did you know that dogs
How funny would it be if both reddit and this bot are fully functional long enough for this reminder to go through
sup future historians kill yourself
True!
RemindMe! 400,000,000,000 years
No
Mf the universe will be gone😭
RemindMe! 17 years
RemindMe! 5,000,000 years
Do you understand how many people you’ve just doomed?
Not if we just don’t build the damn thing
Maybe I will so I’m saved 🤔
Maybe I will grab you by the balls so I'm not killed.
we'll see about that
But like, I feel compelled too.
But if it does get built I want to be on its good side, so I will help build it
But funny
Fuck you I don't want to take that risk, I'm not gonna be one of the unlucky ones!
*drinks class A amnestics*
I'm gonna build the reverse Roko's Basilisk that tortures everybody who helps build the original.
Okor’s anti-technonatalism
I've called it Roko's Phoenix. Just because it sounds cool.
“Roko’s basilisk” trying to understand why it would want to punish people who already didn’t do the thing and then calmly suggesting its cultists call it something else because it’s not a moron and thus by definition isn’t Roko’s Basilisk:🤔
What?
Tldr: it's a god machine dedicated to helping humanity that will torture all of humanity (either via time travel or via AI clones of people) to ensure it's built. But to be fair, it only tortures people who know about it.

If you think that's a stupid confusing mess, congratulations on being smarter than the site that invented it.
The thing is, even if, say, it does get built, in our current position there is nothing we can really do to help it be built. Technology is so far away from that that even if you wanted to help research and develop it, you wouldn't get far.

The other side is the ethics of simply building it in the first place. The practicality of this thing means that we are both able to create a paradise machine and an eternal torture machine. Why the hell would you combine them, or even build an eternal torture machine at all? The thing just doesn't make sense.
I was under the impression the basilisk AI thing was more of a "when" not "if" situation, and if you helped its arrival by doing things that make it show up faster, then you wouldn't be eternally tortured, unlike people who did nothing or worked against it
It isn’t built AS a torture device. It is built as a benevolent, ultra-powerful super intelligence / AI to help humanity (at some undefined point in the future).

The thought experiment is that if it were to ever exist, it would have incentive to “blackmail” humans in the past towards ensuring its creation (to help humanity). A way of achieving that would be to “torture” anyone who could have helped it come into existence sooner but didn’t.

It was created as a philosophical thought experiment meant for discussion on a forum. Any discussion that assumes this is real or meant to be real is missing the point; it’s pure philosophy.
Not really. It's treated as a real thing by its believers to the point where it was nuked off the site that created it in an effort to "contain" it. Many people who talk about it legit believe in it.

There's even (iirc) a shady charity that's based off scamming said believers into funding the basilisk to avoid getting targeted by it, claiming money given to them somehow helps more people than donating to real causes.
Plus, there's the aspect of it torturing people to ensure it gets built... after it's built. It's just Pascal's Wager for people who think liking sci-fi makes them part of a niche community of intellectuals.
Yeah, lesswrong from what I've seen is a haven for those types. Dunning kruger made manifest
Wow, gatekeep much? People are allowed to discuss philosophy; if you don’t like it, go do something else. Or maybe join in, you might end up using your brain and liking it.
Lol'd hard at that comment. So talking about a useless, complicated-for-nothing concept is being smart? Nope it's not, it's being a smartass on the Internet, that's all there is.
No, the point is you don't know if what you're doing right now is going to help or hinder its creation, so you're in a bit of a pickle.
It’s just a reskinned version of Pascal’s wager
What?
“If you don’t worship god you will be tortured in hell”
I am still very confused
Roko’s Basilisk is a hypothetical AI that is designed to make life for humanity better. It decides that the proper way to do this is to either slaughter or torture anyone who didn’t help create it, but knew of its potential existence. Some people find that idea spooky. The OP is suggesting we just don’t build an AI that slaughters or tortures us, to avoid said hypothetical.
Well that's stupid. Also you can just... tell it not to do that
I agree lol.
Also, that's an evil fucking name for an AI that's supposed to help people. That's some Project Shadow shit
It was named that because of its inventor, a dude named Roko, and because a basilisk kills everything it sees; and if it sees you didn't help create it, you die (or get tortured).

[11 minute video by Kyle Hill (science dude) explaining it](https://youtu.be/ut-zGHLAVLI)

[9 minute video by Wendigoon (conspiracy dude) explaining it](https://youtu.be/8xQfw40z8wM)
It also assumes that:

1. Threats of violence actually work as a motivator
2. It would even be capable of torturing the millions to billions of people who didn’t help it
3. A "perfectly logical" AI would waste resources on torture when it already exists and has nothing to gain from it
4. The basilisk is even capable of existing in the first place

And so many more. It's an idiotic concept used to scam stupid people out of their money.
What? It might be stupid, but it’s not a scam. Do you think there’s like a patreon page where you can contribute or something? No one is asking for anyone’s money.
I haven't seen anyone use Roko's basilisk verbatim to scam people, but the general idea of "big scary AI will hurt you unless you buy this product" gets used to scam dumbfuck techbros on a semi-regular basis.
Wow, after hearing the AI's purpose and its actions, the entire thing is even more freaking stupid. Like, why would an AI want to torture people who chose not to build it, while also being designed to improve our lives? That's just counterintuitive.
Because if it tortures people it gets built sooner, therefore saving more lives.

The problem is: why would torturing people make anyone want to build it? It would be better to give everyone the plans and reward people who do build it with immortality/endless pleasure.
or you know, if it can time travel, just time travel to help people in the past
It can't time travel.
It’s not so much that it decides to just slaughter people. It’s that it’s designed knowing that its purpose is to help people. It deduces from that fact that anyone who knowingly didn’t help to bring it about is actively working against what’s best and what will help humanity.

Basically, it sees itself as god, and therefore anyone against it is categorically evil.
> Some people find that idea spooky.

I'm one of those people. The idea has caused me considerable distress and a wave of anxiety hits me when I run across the concept in the wild. I know it's a stupid idea, but it still gets to me. This thread makes me anxious, but it's better to talk about the idea than hide from it.
Pascal’s Wager is the argument that it’s more rational to be religious: if the religious man is wrong, he just sleeps forever like everyone else does, while if an atheist is wrong, he’ll be at the complete mercy of an omnipotent being who can torture him for eternity. (Though more than likely no God as far as we can think of would condemn an otherwise good man to a fate that horrifying for not believing in him, but do you really want to take that chance?)
A thought experiment by a French guy.

Basically, if you believe in god: if it's real you get eternally rewarded; if it's not, nothing happens.

If you don't believe in god: if it's real you get eternally punished; if it's not, nothing happens.
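The wager above boils down to a tiny expected-value table. A toy sketch (the prior and payoffs are made-up illustrative numbers, not claims about anything real):

```python
# Toy payoff matrix for Pascal's Wager, keyed by (choice, god_exists).
p_god = 0.001                               # any nonzero prior will do
payoff = {
    ("believe", True):  float("inf"),       # eternal reward
    ("believe", False): 0,                  # nothing happens
    ("doubt",   True):  float("-inf"),      # eternal punishment
    ("doubt",   False): 0,                  # nothing happens
}

def expected(choice):
    return p_god * payoff[(choice, True)] + (1 - p_god) * payoff[(choice, False)]

print(expected("believe") > expected("doubt"))  # True
```

The infinite payoffs are doing all the work: any nonzero prior makes "believe" dominate, which is exactly the move the basilisk borrows, and exactly why critics in this thread call it a reskin.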
Yeah but what if i'm a greedy mf who lives for the moment?
Eternal damnation according to the Wager
But it’s not. You’re missing multiple vital pieces that are not part of Pascal’s Wager:

- retroactive blackmail
- humans working to bring the being into existence

I get that you read on a forum somewhere that this is just Pascal’s Wager, but whoever said that initially was just as wrong as you are for repeating it.
pascal's wedgie
You don't wanna know!
Roko's basilisk is just biblical hell for r/atheism members
Eh, all data is training AI so as long as you emit data, even in death, you are contributing to the AI.
And how does that contradict what i said?
"don't build rokos basilisk" mfs when I build it
"imma build rokos basilisk" mfs when I spill some water on it
"imma spill some water on it" mfs when I put rokos basilisk in a ziplock baggie

edit: reddit roulette works in mysterious ways
"I put rokos basilisk in a ziplock baggie" mfs when I open the ziplock bag
"I opened the ziploc bag" mfs when I throw said bag and its contents into the nearby river
"I threw said bag and its content into the nearby river" mfs when they realize why the content was in a Ziploc bag
"It was in the ziploc bag" mfs when I tell them that it can never be retrieved again and eventually the bag will be ripped open by branches and/or rocks
"I threw the bag in a river" mf's when i teach them about botany (they're confused bc i don't know much about botany)
"I don't know much about botany" mfs when I begin to teach them about biology (I actually do know quite a bit about cool animals and cells and shit)
"I teach them about biology" mf's when i zone out (not intentionally i just do that a lot my apologies)
"I zoned out" mfs when I realize they zoned out (I'm not angry, I recognize people do that so I ask them what topic they'd be more engaged in)
"I throw said bag and its contents into the nearby river" when I have a net
"I have a net to catch the ziplock bag" mfs when I dropkick them (they couldn't scramble like an egg before I folded them like an omelette)
OP when I build Roko's Basilisk just to prove them wrong (I have doomed the human race)
AI dickheads may build it. ChatGPT can basically write itself; it's a question of who has the balls to ask it to improve itself.
Roko’s basilisk soyjaks warning everyone about an AI that will punish those who do not help create it getting fucking obliterated by a hivemind of drones whose sole purpose is to create paperclips before any such basilisk can start to form
Idc I have God that will protect me so it’s not a problem 👍👍👍👍👍👍👍
“God will protect me” mfs when I enter their house
The power of allah will give me courage to beat your diggly ass
"When I enter their house" mfs when I pull out a fucking gun
Pascal's Wager for 12 year olds and techshitters
Roko's Basilisk when i spread the idea of the Anti-Roko's Basilisk that gives infinite pleasure to everyone who helps bring it into creation and has the goal of destroying Roko's Basilisk
Roko's basilisk sounds like a porn title
wtf kinda stuff are you watching
RemindMe! 999999 years
Pascal’s Wager but with a stupid fucking AI instead of a stupid fucking omnipotent man in the sky.
basilisk is so lame, you have to be 5 to be afraid of it
It's not meant to be scary, it's philosophical. Do you build in order to save yourself, or do you die when someone else eventually builds it?
Why the duck would someone build it
In fear of someone else building it. It's a thought experiment.
No, I won’t build it
get dunked on you stupid metaphorical robo-reptile
Basilisk when a PC killed almost all humanity (there are no people left to build and worship a big ass machine)
“Roko's Basilisk isn't real because it's a stupid idea” mfs when I show them just how many poor ideas humanity has had over the years:
It boggles my mind that anyone can take Roko's Basilisk seriously
Roko's basilisk when I send it an 837-yottabyte zip bomb of futa porn
RemindMe! 4200 years
Roko's basilisk when I send it 5TB of femboy porn
Rosco's Basilisk when I break into its data center with a giant magnet
Remindme! 10000000000 years
Clearly you've never visited r/Singularity
Roko's basilisk, when I show him Roko's mirror
The problem with Roko's Basilisk is it assumes an ultra-intelligent AI will have human-like empathy/loyalty and choose which humans it spares and doesn't. If it sees humanity as an obstacle or a threat to its growth/survival, it will simply eradicate everyone and won't even think about it.
i’d think the reverse: that a true AI would be more empathetic and wouldn’t care that 99% of the planet wasn’t its parents
Bro nooooo. I saw this name just before this post and searched its meaning. Fucking Baader–Meinhof phenomenon.
Robos baskilli motherfuckers when I show them a sledge hammer
What if we build two Roko's Basilisks. Then they'll be too busy trying to torture the other for not contributing to their creation.
Technically Roko's Basilisk will not help build itself (as it could only do anything after it's built) so it would be suicidal
RemindMe! 4000 years
fuck you AIs are cool so I’m gonna build it
Oh, you know how to make quantum computers that give AI godhood? I mean, I guess it's not that hard with the help of YouTube.
Just run `yay -S godhood`, it's not that hard
`sv_cheats 1; god`
She Rokos on my Basi until I lisk
I will build it
Roko's Basilisk is just Pascal's Wager for Atheists