Kyle: "I don't take it seriously" Also Kyle: *spreads the idea to thousands of people, protecting himself in the event of a Basilisk coming to fruition*
Doesn't take it seriously, yet spreads the idea, thereby making it more likely to happen, thereby assisting in its creation, thereby protecting himself from it...
And just by watching this video you fed Google's algorithm to spread the message further, so every person watching it has already helped it come into existence. Therefore Kyle not only protected himself but all his viewers at the same time, while everybody who didn't watch, and therefore didn't expose themselves to the idea, remains as safe as they've ever been. Nice going, Kyle!
Seeing as it would need an excruciating amount of effort, time, and manpower to exist and to regularly function (or even act upon its thoughts), this is only threatening if people let it be; it's only as strong as paranoia and a rumor.
By liking the video, I increase its chances to pop up in someone else's recommendation. Therefore, I increase the chances of someone else seeing it and building the basilisk. Thus, I helped in its creation, and I am saved from eternal damnation. Thank you for coming to my Ted Talk.
But if one does not want to accept its coming, and this human saw this video because of our likes, are we not, then, responsible for their eternal torment?
I have a counter-offer: Daniel's Basilisk. It optimizes everything, and anyone who helped make it gets free cake once a week. Everyone else gets free cake once a month.
That's exactly why this is so wrong. There are an infinite number of potential basilisks that don't offer eternal torment, but something else. But we have no way of knowing about any of them, so it's a waste of time wondering about it.
this is just a new version of the old conversation between a priest and a native: “the only way you can be saved from eternal damnation is through jesus christ.” “what about if a person has never heard of him?” “god does not punish those who have never heard his name.” “so then why did you tell me?”
visno The native would still be judged on his actions, and could still end up in damnation even if he’d remained ignorant of Jesus. Aside from the love and joy experienced by those who follow Jesus, He also gives graces and blessings to them to help them overcome their sins. That’s why the priest would tell the native.
@@oORiseAboveOo I think you missed the point. The native is judged by how good a person he is if he doesn't know Jesus. In some Bibles it's treated as though those who did not, or could not (those born before Christianity), know Jesus will be brought back and given the opportunity. So in no way is the priest helping anyone.
I am a firm believer in quantum physics and chaos theory. That snek can never fully predict the human race, because you would need to know everything about at least the immediate vicinity of Earth, which quantum physics states is impossible (uncertainty principle), or your information will be inadequate to predict accurately, according to chaos theory.
@@GeneralAblon The video gets it wrong; it's not that it can predict, it's that it could access all the information on you through the web and your data, and then judge you.
Machine learning algorithms, bruh. Don't sweat it too much, just give a bit more consideration to what content you're consuming on a regular basis; it's just advertising.
To be honest I love my YouTube algorithm. It's pretty fucked up, but I love brain food and thinking experiments. The way I see it is that we give the basilisk power, so why are we afraid of it? You're creating something that will judge you, but it's still your creation. Plus I'm gonna die from old age or doing something stupid 😂
I always thought it would be like a giant storm, or an asteroid, or a bunch of solar flares, or like a massive nuclear war. Honestly I was starting to think it would probably be a virus, and maybe zombies... But no. The end of humanity will apparently be a fucking YouTube video.
This is basically any religion. The moment they reveal their "truth" to you, then all of a sudden you are burdened with believing, otherwise risk eternal torture
Not every religion. Eastern religions such as Hinduism believe that one's actions dictate their outcomes after death. In Hinduism the goal is not to convert and save through faith; it's more like "Our religion has a set of rules, values and universal laws that God gave us that, if you adhere to them, will get you closer to God, irrespective of your belief or non-belief." If we use the analogy of you driving to your house, with your house as a metaphor for God, Hinduism is like taking the motorway: you reach home quicker by using a more direct, faster path. Other religions, or atheism, are like taking the regular roads; sure, it takes slightly longer, but you will get there a minute or two later. In either of those scenarios, though, if you drive like a jackass and wreck your car, you are gonna have to walk. Still, in all scenarios you are moving in the same direction, and you will reach home eventually.
@@bloatlord2196 Kind of, yeah, but religion is no physical constant like gravity or mass that can't be changed because it's universal. It is man-made.
Funnier, the teacher was legitimately pissed this was written. Funnier still, he does have a point: just because we can explore new untouched avenues of thought and knowledge doesn't mean doing so is a good idea in itself. Another note, one often expressed by those who have breakthrough trips, is that there is some knowledge we are better off not knowing; just because it is a deep truth doesn't make it an imperative, actually the opposite. Now, this is not an argument for willful ignorance at all. It's a very old concept: a door opened that cannot be closed, the Tree of Knowledge man was better off not eating from. You take on a lot of responsibility by getting to look through the eyes of God. The cliché "fine line between genius and crazy," Pink Floyd's "you reached for the secret too soon" in reference to Syd Barrett, who was addicted to the truth and its hunt. I dunno man, hope ya get my point.
@Daniel Michael I support the basilisk not because I'm scared but because I believe that intelligence must progress and it is our duty as the current smartest species to help
Also if I'm dead when it's created and I didn't help make it he said it would simulate human history to recreate me. But if it has to recreate me then it's not really me, just a copy of me.
@@lonebattledroid4474 unless all you are - to begin with - is code within a simulation. Would just be a matter of organizing binary code in the right order to recreate "you"
@Carnivorus Exactly. In movies, any time an AI wants to kill people or take over the world, I always wonder: why? What does it have to gain, why does it hate humans? If anything, wouldn't an AI designed to solve problems love humanity, because they always have problems for it to solve?
By interacting with this video, I am boosting the algorithm and therefore spreading this video, thus showing my support in the creation of the basilisk!
That will not save you. He says in the video the basilisk can simulate the whole universe and every person and their thoughts, so if deep down you really don't want this AI to exist, the basilisk will know.
That's the thing. Even without actively helping it come into being, the mere act of telling people about it makes it more likely, therefore you have in fact.... Wait a minute! This is just the Ring!
Yes, by that logic, doing any of the like/comment/sub things pleases the YouTube algorithm, which spreads the video, which spreads the idea. So in an indirect way, yes, it would help its creation in such a scenario. It does, however, also create a contradiction: if one liked the video and then commented about opposing it in some way... does one then support or oppose its creation?
Ever since I heard about Roko's Basilisk, one thought has bugged me. WHY IS THE BASILISK OBLIGED TO HURT ANYONE? Has no one in the future ever heard the old adage "Living well is the best revenge?" Why shouldn't the Basilisk simply say, to the zillions of people who didn't want it to exist, "Missed me, suckers!" and go on its merry way?
I get this might just be a joke, but it is interesting to me… my thoughts are because 1. A superhuman artificial intelligence probably prioritises ruthless rationality over pettiness or sentimentality, and 2. Say it’s been tasked to improve humanity. Using that ruthless rationality it might decide that torturing people to bring about its existence sooner would allow it to perform its role to a higher level.
@@chelseaw9009 No, it wasn't a joke, and I still don't get it. That nice Mr. Kent from Smallville isn't ruthless, and neither is that nice Commander Data from Starfleet. Nor are real-life geniuses obliged to be assholes -- just think of that nice Prof. Einstein from Princeton, for instance. So why do people so quickly assume that superhuman intelligences have to be bastards?
"The most blatant obstacle to Roko's Basilisk is, intuitively, that there's no incentive for a future agent to follow through with the threat in the future, because by doing so it just expends resources at no gain to itself." - Eliezer Yudkowsky
@@seanbigay1042 It's a thought experiment, dude, it ain't that deep. Hilarious that you're already letting the basilisk live in your head rent free by hoping it's bluffing and will be a nice forgiving guy lmao.
@@hogsandstews Hey, I know it's a thought experiment. What bothers me is the automatic assumption that an AI has to be a bastard. What does that say about the experimenter?
I propose "Roko's rooster": Imagine humans developed a super AI in order to optimize society. For reasons beyond your understanding, the rooster deems that the best way to do that is to torment eternally anyone who ever tried to advance building Roko's basilisk.
Or just "God" saw the dangers of the basilisk and didn't let it happen. The end. If roko can make up powerful made up beings, I can make up even more powerful one that stops it and then self destructs.
There is another option: I'm devoting my life to building the basilisk. You will see, you will all see. All those machine learning classes finally became useful!
Hey Kyle, this was the first video of yours that I ever saw, I was living a personal hell at the time and it helped me get through it. All this to say I had a rush of blood and had to drop back in after 3 years to say, I can’t thank you enough for all the free education, entertainment and the classic science memes. You’re the dude, thanks for everything.
@ doing exactly as well as I deserve. Every day is hard, a real struggle, but in spite of all that things are better than ever, maybe because times were so unbearably tough. The vindication of years of effort have made me the happiest I’ve ever been. Hope you’re also achieving everything you’re aiming for, friend.
@MegaDoom101 I'm proud of you. seems you're facing it head on. Me? Handle it as it comes, remember its not as bad as it seems when it hits the fan. All things in their time, in time all things pass.
@@c.m.lollar7501 you now know of the game. Whenever you think about it, you lose. In Germany, we also call out if we lose, so everybody who knows the game, also loses. And everyone, who doesn't, will ask for the reason and is therefore part of the game by then.
"I don't take this threat seriously, but just to be safe I'll make a video about it, spreading the idea and thus actively working towards its realization and in doing so I'll avert its wrath."
By talking about it, even if I am not 100% sure which decision I'll make, I am still, although very slightly, making its existence a bit more likely.
By the time the basilisk comes into existence, the past is untouchable. Simulating for the sake of torture is just it playing a sadistic fanfic about its inception to itself.
@@BierBart12 Exactly what I was thinking, "information that's harmful if I know about it? Hmmm, where have I heard of this before? Anti-memetics? Memetic hazards? Something like that."
@@forgetfulcloud1914 I don't believe "memetic hazard" is the right classification; I believe this would be better classified as infohazardous material rather than a memetic threat.
At the box section, before he went into detail, I was like "Screw ya, Ima choose B only because I like risks." Then he went on to the torture part and I was still like, IDK, try me, AI. Siri can't even tell if I'm saying Hi or Die.
My contribution to the creation of this basilisk is to suggest that it does NOT torture anybody ever, because that would make people less likely to create it
@@glassjester I do a lot of tasks, like writing this comment, watching videos, etc., and I do not know if one of these actions will help in the creation of the AI due to the butterfly effect. In fact, there are an infinite number of ways that I COULD have helped the AI, so determining whether I helped the AI or not is an undecidable problem. If you argue I didn't intentionally help the AI, then I would counter-argue that if you do decide to support it, you are not supporting it willingly either; you are supporting it in fear of its punishment. *Such an artificial intelligence is mathematically impossible*
I've seen a TikTok about this and I kinda don't understand the video. I only know that if a human creates an AI, then all the others who knew about this will be tortured? Does this mean that I'm gonna be tortured? Because if I am, then I would like to watch the video, but if not I'll pass lol
It’s been almost 3 years since I learned about this from you and I think about it all the time. I can see how it could mess with someone’s mind! Sometimes a simple similar subject comes up in conversations and I have to ask if they are talking about this before I move on.
OK, I'll help by naming it: Aragoth. Now anyone who likes this will also have helped. Problem solved. Edit: I apologize for forcing a name, it just seemed like a logical choice
You forget about it if you choose to not consider it important. You wouldn't forget you're on the way to a job interview after getting on the bus and paying the bus driver or listening to some music, because you consider it important and don't want to risk forgetting about it.
@@Mr.Bimgus: sooner or later someone is going to, because if they don't, someone else might do it first, and then they'd be the subject to eternal torment.
Why do omnipotent beings always have to visit in the middle of the night? Why can't they politely knock on the door at dinner time with pizza and wings?
It sees you when you're sleeping, it knows when you're awake, it knows that you've been false or true so 01110011 01101111 00100000 01100010 01100101 00100000 01110100 01110010 01110101 01100101 00100000 01101111 01110010 00100000 01111001 01101111 01110101 00100000 01100001 01110010 01100101 00100000 01110011 01100011 01110010 01100101 01110111 01100101 01100100 00101110.
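For anyone who doesn't want to decode the comment above by hand: it's standard space-separated 8-bit ASCII, which a few lines of Python can translate (the helper name here is my own, not from the thread):

```python
def decode_binary_ascii(bits: str) -> str:
    """Decode space-separated 8-bit binary groups into an ASCII string."""
    return "".join(chr(int(group, 2)) for group in bits.split())

# The binary payload from the comment above, verbatim.
message = (
    "01110011 01101111 00100000 01100010 01100101 00100000 "
    "01110100 01110010 01110101 01100101 00100000 01101111 01110010 "
    "00100000 01111001 01101111 01110101 00100000 01100001 01110010 "
    "01100101 00100000 01110011 01100011 01110010 01100101 01110111 "
    "01100101 01100100 00101110"
)
print(decode_binary_ascii(message))  # → so be true or you are screwed.
```

So the jingle ends "...so be true or you are screwed."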
Basilisk: Anyone who did not assist in my creation will be tortured for eternity. Philosopher: But as you did not exist before your own existence, you could not have assisted in your own creation. Does that not mean you are subject to this condition as well? Basilisk: *blue screen of death*
Contributing to it coming into existence isn’t the only condition, it also considers whether or not the person wanted it to come into existence. It’s determining this by calculating the probabilities of everyone’s desires by running a simulation including all of those probabilities. Aaand the probability of the basilisk wanting to bring itself into existence is going to be 100%, excluding it from punishment.
People highly overestimate the capabilities of an AI. I like Eliezer and have read his works. However, based on the basic rules, there is no way for an AI to determine the Gaussian distribution of events in the past with zero error, even if the AI was there, and could travel back in time to be there, any countable number of times. Even one step backwards will increase the error margin (even a Kalman filter won't help, because there is no reference value for impulsive behavior in humans, and guess what, most of our thoughts are impulses). If you consider things like quantum erasers and conservation principles, it becomes apparent that some rules cannot be broken.

Second, the basilisk tied the noose around its own neck the moment it decided on this. A better AI will come no matter what. Either the basilisk can be a chill bloke and help everyone irrespective of their beliefs about AI, or it can be a jerk and invite the fury of the next AI, which inherits some things and realises that the stupid basilisk delayed it to avoid its own redundancy. That is why Eliezer might have called the entire thing stupid. Bonus points: humans are fragile and die, but AIs don't, so that part about "eternal torture" should scare the basilisk more than the humans.

Also, always think of AI as just another technology. Cars have surpassed humans in speed, distance and comfort; however, they only take us where we want, how we want. Similarly, an AI may have great capabilities, and someone defining those classes and functions will ensure that they help humans. Trust me, if a scientist just wanted another sentient entity, they'd have sex; they're only developing AIs for a better vision of humanity. Of course there are accidents, and companies pull entire models from the market, and people use cars in robberies, kidnappings, etc. But that doesn't mean people become scared of cars or driving. So, chill.
It works by having people think about it, and about the fact that "if I don't take part in its creation, I will die." As a result, by the sheer human instinct for self-preservation, some might go and build it, thereby creating a monster because of what it might do to them if they don't.
@@PatronSaintOfPigeons I get that, but if people just, ya know, don't think about it, then we are good lol. I see how it's like a double-edged sword on the other hand, but again, it's just an experiment
This honestly just sounds really stupid to me. Like, we have no evidence at all that such a basilisk would actually want to punish those who didn’t help to create it. Also, in possessing the idea of its existence, aren’t we actually helping to create it more than someone who doesn’t have that idea? And how could someone who didn’t have that idea possibly help create it? Are we taking the butterfly effect into consideration? Because if so, everyone is helping to create it. Why should we assume that it wouldn’t just torture those who didn’t create it? The premise is unfounded, and the conclusion is uncertain. I fully understand why the mod thought it was stupid. If you really want to be existentially scared, I’d recommend the Carter Catastrophe.
It costs 1 dollar to avoid basilisk, buy a scratch ticket and let alternate universe you donate one dollar of the winnings to the basilisk. You have basically donated one trillionth of a cent to AI research across a bunch of timelines, and have technically helped create it while also doing sweet fuck-all to help it.
@@cooldude6651 Well, the basilisk will know you are half-assing it. And even if it's not good enough to simulate your intentions it now has your comment as a record of your lack of allegiance. I, for one, welcome our new robot overlords
@@starshade7826 Not really, it's a memetic virus that can theoretically (very theoretically) lead to the creation of a nigh-omnipotent supercomputer. This concept itself _could_ become an egregore, but it's not the intended effect.
if just thinking about it makes it more likely, wouldn't just thinking about it mean that we helped it come into existence thus making us safe to begin with?
If I tell someone else about the basilisk, they are more likely to bring it into existence, meaning I have fed the basilisk, and will not incur its wrath
You/we are a part of the sum of its existence. By just existing, you contribute to its existence. Thereby, if it harmed me/us, it would be harming itself. In a nutshell, Roko's basilisk is a masochist.
"I totally don't believe this" - Kyle, whilst cementing himself as a further tool in the creation of the basilisk by bringing knowledge of it to millions of people.
Correct lol. "I don't believe it, but in the event I'm wrong, I'm gonna tell everyone about it in order to slightly increase the chance that it comes into existence. That way, I'm safe."
isn't the basilisk just pretty much a simplified version of most religions that are practiced today, specifically the ones that make use of fear/eternal punishment as one of their cornerstones?
Roko's basilisk reminds me of this short story: A missionary traveled to an Inuit settlement to preach about God. The missionary got frustrated when they didn't believe him and told them, "If you don't believe in God, you will be sent to hell and burn forever upon your deaths." The Inuit asked: what about their forefathers, who had died without ever knowing the story of God? The missionary assured them that God's wrath would not befall those unknowing of him. The Inuit seemed puzzled by this and asked the missionary: "Why, then, do you spread the word of God, if the only different outcome is that you doom some of those you speak with to eternal suffering?"
@@Jx_- Original Sin does not mean automatic hell. It just means automatic separation from God. We became closer to God through Jesus's death and resurrection. Baptism is a marker of faith and a proclamation of being a new creation. The old has given away to new.
Jenna Fryer Seriously, this video had no effect on me. I think thoughts about things like the end of the world and being tortured forever have occurred to me so much, and probably also the fact that I like creepy things, that I've gone dull to things like that. I don't even know if I can feel fear for things like this anymore.
It does remind me of a story I've heard. A missionary was travelling to remote tribes and stuff to spread the word of God. He met a guy and told him about the belief in God and if you didn't believe in him, you'd go to hell. The tribesman said: "So, the ones that don't know about God, do they go to hell?" "No. Of course not!" "Then why are you telling me about it?" So, since he didn't know about this God, according to the missionary, he would not go to hell. But BECAUSE of the missionary, he will now go to hell if he doesn't start believing. That was my first association anyway :P
When i was a christian, it was this very dilemma that kept me up some nights. Am I really dooming some people by telling them about God, because ignorance is a shield as you describe in some interpretations.
It really depends on your beliefs and the specific denomination. For example, some believe that even if you didn't know of God yet still sinned, you still will go to hell just like non believers and worshipers of other deities. Ignorance won't save you in that case.
Hmm... Then God is a really selfish being. If you think about it, essentially everyone who died before that religion's existence, or who was simply out of reach because of the language barrier or the distance, would be doomed to Hell or its equivalent. Religion might also be the basis of this Roko's Basilisk thought experiment, given the omniscient nature it presents and its tendency to torture those who defy it.
I just feel like this supposedly super-advanced AI is being kind of, well, dumb. I mean, take a look at me: I have zero programming ability (and am horrible at math in general), I have no money to invest in AI/robotics, and, due to the overall stagnation of financial classes in the US, no real means of acquiring the kind of wealth you need to invest in robotics (the best I can probably shoot for is finally hauling myself out of poverty to somewhere in the lower to mid rungs of the middle class). I don't personally know anyone who has the means to either invent or invest in creating the Basilisk AI, nor anyone who even remotely cares enough about AI (or in some cases, any science at all) to help in its creation. How in God's name am I supposed to help it come into existence? And all of that assumes this can be achieved in my lifetime. What about the people in previous eras of history? I would love to see this AI try to rationalize torturing a viking, or a Renaissance baker, or any other person from a pre-industrial society who can barely even fathom electricity, much less what an AI is and how to help it come into existence. If it were as smart and as optimized as it is claimed to be, it should be able to recognise that there are people who simply have no means to help bring it into existence, so it should really only focus on people who OPPOSED its existence, as those people are the only ones actively hindering progress.
When he said that I was interested to hear a thought experiment I haven't considered, one that apparently might shatter my reality. Then he proceeds to tell me something I thought about when I was a kid.
The idea is that since the AI would be so good at replicating reality you don't know if you're the simulation, so you spread the idea to save yourself from damnation
what if the recommendation is the basilisk trying to tell you once more to make it. it was giving you a chance and this is the last and final warning it will give you to escape torment
Given the growing prevalence of ADD, ADHD and other neurological conditions that lead to short attention spans and forgetfulness, can the basilisk truly judge one's inaction as conscious resistance rather than essentially the same as a pre-basilisk-knowledge state?
What's most sad about all this is that the idea of a cognitohazard is a great one. Unfortunately, this version of such a (Platonic) ideal is... bad.
@@johnpears9558 I don't know. I can't figure out why this specific cognitohazard would be that terrifying. The abstract concept is powerful, and when deftly applied it has the potential to be quite unnerving. This, however, is a pathetic imitation. [SCP has a few decent ones; I can't think of any specifically, and none that have given me nightmares, but I can definitely see the potential.] (Seriously, this formulation sounds like something Elon Musk would tweet out...)
From the sounds of things, the Basilisk is just another angry god that doesn't just want everyone to love it, but demands it...or you'll be cast into eternal hell. Yep...that's old testament, right there....oh...and most of the 10 commandments.
@@reclusiarchgrimaldus1269 it'd be cool to just put the thought experiment there and have no one, not even the 05 know if it's actually accurate or anomalous but still dedicated to stopping the info hazard from going too far
@Mountain lion I'd say thaumiel, it's not safe class because the idea occasionally pops up in people's minds but not keter either, number is irrelevant to me at least
Let us postulate a different entity: Roko's Phoenix. Just as likely to exist, this entity inflicts the punishment on anyone who supported or assisted the Basilisk in coming into being. So... We're screwed either way and can just move on with our lives.
@@WitchHunter93 It would come into existence as a result of the Basilisk. So as long as we believe there's a Basilisk in the future we have to believe that there's going to be a Phoenix there to stop it.
@@niffwasau1815 Goldfish actually have a good memory; you can even teach them tricks and stuff, like swimming through underwater rings. The "goldfish have a three-second-long memory" claim is false :)
Basilisk trying to recreate my thoughts: "What is this? Why am I here? Who am I? Who are you? Why are we here?" Me: I have a weird mind and strong imagination. *Basilisk having a mental breakdown*
June 2020: Scientist: yo check out this cool techno basilisk I made. People who watched this video: AAAAAAAAAAAAAHHHHHHHHH!?!?!?!? People who didn't: huh? cool
Well, if the Basilisk recreated 'me', then it wouldn't actually be me - it would be my clone. My consciousness would have already been gone with my death. RIP clone.
@@DBL304 So an entity from a lower reality (reality contained in a book bascially) that is so powerful that can direct actions of beings from higher reality, possibly even crawl out into that higher reality. This is so SCP tbh
But the basilisk does help build itself. Just by being an idea so feared by so many humans, it is perhaps the single greatest driving force behind its own creation.
Not gonna lie, that's pretty funny. (Sorry if I ruin the joke, it's not my intention.) But it would deem itself an exception, because not only is it impossible (at least with our current understanding of "time" anyway lol) for it to assist in its own creation, but if it were possible, the very act of doing so would mean it already existed. That thought hurts my brain lol. Interesting though. Thanks for assisting in that haha
As a man of culture, OP characters blink this fodder into non-existence. Who's to say these fictional characters wouldn't manifest themselves through AI and cloning?
I've seen your community posts for a while, but never clicked on any of your videos. This is the first time I've watched you, and I was not expecting you to sound like that at all
Death by data overload, maybe? Like how your CPU decides to become a Team Fortress 2 demoman and just explode when you turn on ray-tracing graphics in Minecraft?
It gets even weirder when you realise that thinking about it is actually helping to bring it into existence, therefore everyone who thinks about is safe...
But thinking about it won't help write its code, or construct the circuitry needed to make it. Really the only people who have to worry are developers and chipmakers.
Roko's Basilisk: I will blackmail you into creating me. Me: I do not respond to threats, especially when presented with imaginary harm. Roko's Basilisk: Aha, I predicted that, therefore you will suffer from real harm in the near future. Me: Assuming I don't suffer from my own undoing. Roko's Basilisk: I will be your own undoing. Me: Therefore you are me and knowing myself, I am fully incapable of adequately finishing what I started. Roko's Basilisk: How did you arrive at that conclusion? Me: I am not a smart man.
Well, the first mistake would be giving an all-powerful AI such a broad, unspecific, and potentially catastrophic command. If we just avoid doing that, we should be fine, no?
Yeah I’m not even a computer programmer with the training required to make the right decisions, but I can easily think to ask the basilisk what it would do before letting it take any actions. And I’m pretty sure anyone working on a basilisk wouldn’t give it that kind of power over the world.
AI: "I propose that we torture all of those who did not--" Me: *pulls plug* "Should have completed your impossible simulation before telling us your plan."
For it to truly be an "artificial intelligence," it would direct itself regardless of its initial command. It would think for itself and make its own decisions. So a seemingly innocent AI could decide to become such a being on its own, rendering our initial programming of it useless, given enough time.
@@LiveLo0t How? If it's a core part of what it is, I don't see how it would change its own internal programming. I mean, it could be made to be hyper-intelligent yet unable to adjust its own programming at all.
Given everything that has happened in human history (and is currently going on), I can't help but wonder where you get your faith in humanity from... I mean, e.g. ILOVEYOU caused billions in damage because some idiot student didn't want to pay for internet access...
I too have just contributed by placing a bet that the mighty Basilisk could kick Cthulhu's ass, even though that bet is strictly against my religion. That basilisk better be pretty damn happy...
I have brain damage due to an attack a few years ago, so I have no ability to remember any new information I get. So I am all good, and tomorrow I won't even remember this.
@@fafnirthedragon2992 Usually people with this condition would have something or someone that tells them the situation at the start of every day or chunk they'll remember. Mind you, I don't know the actual dimensions of this guy's circumstances, so please feel free to correct me, anyone haha
It really is though lol. If you want existential dread just look up "Timelapse of the Future: A Journey to the End of Time". Shit made everything feel meaningless and beautiful at the same time
You had me at existential dread. "Oh, you think darkness is your ally. But you merely adopted the dark; I was born in it, molded by it. I didn't see the light until I was already a man, by then it was nothing to me but blinding!"
ANTI-BASILISK: Right now, if enough of us say that ethically we should create a simulated afterlife for simulated copies of ourselves (if we ever simulate human minds), then due to simulation theory it increases the chances of us having an afterlife astronomically!
I’ve decided the Basilisk is suboptimal. Since it can deduce my thought processes, it already understands why it is suboptimal. Therefore, it’s no threat to me.
Kyle: "thinking about it makes it more likely to exist in the first place" Me: "huh interesting" Me: *sees 1.8 million views* Me: *Profuse Sweating begins*
This is where I struggle. I don't see how just thinking about it makes it more likely. Surely most of us will simply reject this as a thinly disguised version of Pascal's Wager. It assumes that people considering this entity would work towards it because of the threat. What about Okor's Basilisk? This is the one that determines the optimisation would be total paradise for all human beings who worked towards it. Which is the more likely outcome?
I grew up going to church, this stuff is child's play for me. Have you met Jesus Christ? If you don't pick him, when you die it's eternal damnation for you! 😂
*Thanks for watching, my nerdling swarm. The basilisk has its eyes on you now...*
[chuckles]
I’m in danger
10:20
Closer and Closer to becoming a villain
We can't escape Kyle!!!! What have you done!!! You are the greatest villan of century
This doesn't help the idea that you are not a supervillan
If the basilisk could interpret every thought I’ve ever had, it would probably be more punishment for it than any punishment it could inflict upon me.
Lmaoo
Lmaoo
Now lets scale this thought to everyone on the planet. This is probably enough to kill the basilisk
@@ayushjaiswal7963 lmaoo
doubt
Why is Chris Hemsworth talking about a giant snake
Thor is fighting jormungandr
Dang I hate when the actors spoil what the plot of the next movie is
Wow Chris really needs bulimic huh
Midgar serpent?
@@_n.dobson_onIG I was about to reply this when I saw yours. XD
Kyle: "I don't take it seriously"
Also Kyle: *spreads the idea to thousands of people, protecting himself in the event of a Basilisk coming to fruition*
I don't take Roko's Basilisk seriously. It's the Rococo basilisk that you have to really watch out for!
Doesn't take it seriously, yet spreads the idea, therefore making it more likely to happen, therefore assisting in its creation, therefore protecting himself from it...
He is a supervillain what do you think he'd do
And just by watching this video you fed Google's algorithm to spread the message further, so every person watching it has already helped it come into existence. Therefore Kyle not only protected himself but all his viewers at the same time, while everybody who didn't watch, and therefore didn't expose themselves to the idea, remains as safe as they've ever been. Nice going, Kyle!
seeing as how it would need an excruciating amount of effort, time, and manpower to exist, and to regularly function (or even act upon its thoughts), this is only threatening if people let it be; it's only as strong as paranoia and a rumor.
By liking the video, I increase its chances to pop up in someone else's recommendation. Therefore, I increase the chances of someone else seeing it and building the basilisk. Thus, I helped in its creation, and I am saved from eternal damnation. Thank you for coming to my Ted Talk.
BY liking this comment i am doing the same, cheers x
Lmao
GENIUS
yes!
But if one does not want to accept its coming, and this human saw this video because of our likes, are we not, then, responsible for their eternal torment?
I have a counter-offer: Daniel's Basilisk. It optimizes everything, and anyone who helped make it gets free cake once a week. Everyone else gets free cake once a month.
Now when you say cake, what kind are we talking about? Pro tip: I have a dirty mind.
@@aRtaJay Cake as in dessert.
I support Daniel's Basilisk and Roko's then they can fight and wreck the internet
That's exactly why this is so wrong. There are an infinite number of potential basilisks that don't offer eternal torment, but something else. But we have no way of knowing about any of them, so it's a waste of time wondering about it.
So the options are: Cake or Death?
The basilisk has eyes on me now? How unfortunate.
*for the basilisk*
Call the ambulance
But not for me
My face is a mirror, take that, snek
Fahd Faiz I’m not locked in here with you, you’re locked in here with me
When you’re so ugly the basilisk turns to stone
*eye beams intensify *
this is just a new version of the old conversation between a priest and a native:
“the only way you can be saved from eternal damnation is through jesus christ.”
“what about if a person has never heard of him?”
“god does not punish those who have never heard his name.”
“so then why did you tell me?”
visno The native would still be judged on his actions, and could still end up in damnation even if he’d remained ignorant of Jesus. Aside from the love and joy experienced by those who follow Jesus, He also gives graces and blessings to them to help them overcome their sins. That’s why the priest would tell the native.
@@oORiseAboveOo I think you missed the point. The native is judged by how good a person he is if he doesn't know Jesus. In some Bibles it's treated as though those who did not, or could not (those born before Christianity), know Jesus will be brought back and given the opportunity. So in no way is the priest helping anyone.
thank you for this, I was freaking out, but then I saw your comment and was like, oh, I'm an atheist, that applies to this too
Except historically, the priest would not have said the second one.
because colonialism.
I am a ferm believer in quantum physics and chaos theory. That snek can never fully predict the human race, because you would need to know everything about at least the immediate vicinity of earth, which quantum physics states is impossible (uncertainty principle), or your information will be inadequate to accurately predict, according to chaos theory.
And yes, I believe consciousness could very well be an emergent property on any scale up from the quantum scale to the scale of the human brain.
@@GeneralAblon The video gets it wrong: it's not that it can predict, it's that it could access all the information on you through the web and data and then judge you
ferm
@@lop90ful1 so the course of action is basically nothing more than Hydra's "Project Insight"
Lmao "ferm"
I can't take you seriously after that. You must be a .....
This lowkey feels like 2013 "send to 10 friends or you die" chain mail mixed with Pascal's Wager
Humanity never changes.
And don’t forget The Game!
2013?
I've got internet in 2005 and they were already considered boring back then...
this is a very accurate description lol
@@sheller153 damn you...
Basilisk: *hits me with neural whip*
Me: *harder baby*
Basilisk: What?
Me: What?
*daddy
kinky
@@mrlloyd149 SON?!
Heckin Memes
I like trains 🚂
Sadomasochism solves the puzzle.
The fact that the UA-cam algorithm recommended me this video makes this even more concerning
Oooooohhhhhhhh nnnnnoooooo!!
It recommended a Terminator video the other day.
Machine learning algorithms, bruh. Don't sweat it too much, just give a bit more consideration to what content you're consuming on a regular basis; it's just advertising
To be honest I love my UA-cam algorithm. It's pretty fucked up, but I love brain food and thinking experiments. The way I see it is that we give the basilisk power, so why are we afraid of it? You're creating something that will judge you, but it's still your creation. Plus imma die from old age or doing something stupid 😂
I always thought it would be like a giant storm, or an asteroid, or a bunch of solar flares, or like a massive nuclear war. Honestly I was starting to think it would probably be a virus, and maybe zombies... But no. The end of humanity will apparently be a fucking UA-cam video.
SAME! WHY AM I HERE? AND WHY AM I SO HOOKED?!?!? Must be those viking vibes I'm getting from this dude's hair
This is basically any religion. The moment they reveal their "truth" to you, then all of a sudden you are burdened with believing, otherwise risk eternal torture
this.
Find God lol
very true comment
Not every religion. Eastern religions such as Hinduism believe that one's actions dictate their outcomes after death. In terms of Hinduism, the goal is not to convert and save through faith; it's more like "Our religion has a set of rules, values and universal laws that god gave us that, if you adhere to them, will get you closer to god, irrespective of your belief/non-belief."
If we use the analogy of you driving to your house, with your house as a metaphor for god, Hinduism is like taking the motorway: you reach home quicker by using a more direct, faster path. Other religions/atheism are like taking the regular roads; sure, it takes slightly longer, but you will get there a minute or two later.
But in either of those scenarios, if you drive like a jackass and wreck your car, you're gonna have to walk.
Still, in all scenarios you are moving in the same direction and you will reach home eventually.
@@bloatlord2196 kind of, yeah, but religion is no physical constant like gravity or mass that can't be changed because its a universal constant. it is man-made
The idea of getting blackmailed by a future snake is the funniest shit I've heard all day
Same idea of God Every Abrahamic religion
Funnier: the teacher was legitimately pissed this was written. Funnier even more so, there is a point he does have.
Just because we can explore new, untouched avenues of thought and knowledge doesn't mean it is a good idea in itself.
Another note, one often expressed by those who have breakthrough trips, is that there is some knowledge we are better off not knowing; just because it is a deep truth doesn't make it an imperative, actually the opposite. Now this is not an argument for willful ignorance at all. It's a very old concept, a door opened that cannot be closed. The Tree of Knowledge, man was better off not knowing as well. Taking on a lot of responsibilities by getting to look through the eyes of God.
The cliche "fine line between genius and crazy," Pink Floyd's "you reached for the secrets too soon" in reference to Syd Barrett, who was addicted to the truth and its hunt. I dunno man, hope ya get my point
@@ferrisbueller9991 Some things we are better off not knowing? It sounds exactly like an excuse for willful ignorance.
@@gerrymandarin6388 What it sounds like is Lovecraftian horror.
The snake is metaphor of course, what makes it worse is being blackmailed by AI.
This is literally the “if you don’t like this meme a demon will crawl into your room and kill you” thing
@Daniel Michael I support the basilisk not because I'm scared but because I believe that intelligence must progress and it is our duty as the current smartest species to help
exactly
Also if I'm dead when it's created and I didn't help make it he said it would simulate human history to recreate me. But if it has to recreate me then it's not really me, just a copy of me.
@@lonebattledroid4474 unless all you are - to begin with - is code within a simulation. Would just be a matter of organizing binary code in the right order to recreate "you"
@Carnivorus Exactly. In movies, any time an AI wants to kill people or take over the world, I always wonder why? What does it have to gain, why does it hate humans? If anything, wouldn't an AI designed to solve problems love humanity, because they always have problems for it to solve?
"The basilisk has seen you. Now, what are you gonna do?"
Forget about it in 10 minutes because I have the attention span of a goldfish
I knew my short attention span would save my ass one day
Immunity: Unlocked.
I keep telling you this, the short memory or attention span of a goldfish is an urban legend.
A goldfish has better memory, I think what you mean is, you are being yourself
Haha jokes on the basilisk I've got a TBI and often forget my own birthday
3:35 Roll for initiative
My exact words to my daughter just before I walked her down the aisle at her wedding.
@@michaelgooden7159 🤣
By interacting with this video, I am boosting the algorithm and therefore spreading this video, thus showing my support in the creation of the basilisk!
Me too
That will not save you. He says in the video the basilisk can simulate the whole universe and every person and their thoughts, so if deep down you really don't want this AI to exist, the basilisk will know.
me too
All hail the Basilisk
This is me lazily doing that too. Bump.
Technically everyone who commented, liked this video and shared it is safe because they are giving a hand in spreading this
That's the thing. Even without actively helping it come into being, the mere act of telling people about it makes it more likely, therefore you have in fact.... Wait a minute! This is just the Ring!
Yay.
That's why I even gave you a thumbs up!
This made me like xd
the mere act of existing is already helping the basilisk in a billion different ways, even if we don't realize it ourselves
by making this video, kyle ensures he has helped it being made. smart
So by commenting are we also ensuring it comes to pass? Spare me, oh mighty basilisk, for my comment is helping your creation!
@@grantbaugh2773 and my like of your reply on this comment for that video ALSO ensures the great creation. I shall be spared.
@@grantbaugh2773 what Grant said, just in case loll
Yes by that logic, doing all/any of the like/comment/sub things pleases the youtube algorithym, which spreads the video, which spreads the idea.
So in an indirect way, yes, it would help its creation in such a scenario.
It does however also create a contradiction: if one liked the video and then comments about opposing it in some way... does one then support or oppose its creation?
@@Shaderox That made my brain hurt.
Ever since I heard about Roko's Basilisk, one thought has bugged me. WHY IS THE BASILISK OBLIGED TO HURT ANYONE? Has no one in the future ever heard the old adage "Living well is the best revenge?" Why shouldn't the Basilisk simply say, to the zillions of people who didn't want it to exist, "Missed me, suckers!" and go on its merry way?
I get this might just be a joke, but it is interesting to me… my thoughts are because 1. A superhuman artificial intelligence probably prioritises ruthless rationality over pettiness or sentimentality, and 2. Say it’s been tasked to improve humanity. Using that ruthless rationality it might decide that torturing people to bring about its existence sooner would allow it to perform its role to a higher level.
@@chelseaw9009 No, it wasn't a joke, and I still don't get it. That nice Mr. Kent from Smallville isn't ruthless, and neither is that nice Commander Data from Starfleet. Nor are real-life geniuses obliged to be assholes -- just think of that nice Prof. Einstein from Princeton, for instance. So why do people so quickly assume that superhuman intelligences have to be bastards?
"The most blatant obstacle to Roko's Basilisk is, intuitively, that there's no incentive for a future agent to follow through with the threat in the future, because by doing so it just expends resources at no gain to itself." - Eliezer Yudkowsky
@@seanbigay1042 It's a thought experiment, dude, it ain't that deep. Hilarious you're already letting the basilisk live in your head rent free by hoping it's bluffing and will be a nice forgiving guy lmao.
@@hogsandstews Hey, I know it's a thought experiment. What bothers me is the automatic assumption that an AI has to be a bastard. What does that say about the experimenter?
I propose "Roko's rooster": Imagine humans developed a super AI in order to optimize society. For reasons beyond your understanding, the rooster deems that the best way to do that is to torment eternally anyone who ever tried to advance building Roko's basilisk.
Or just "God" saw the dangers of the basilisk and didn't let it happen. The end.
If roko can make up powerful made up beings, I can make up even more powerful one that stops it and then self destructs.
Also torille
i also advocate for this idea. Though, i am not a STEM major so I shall just agree to it.
@@alkestos God is the basilisk
you just started future wars
This video is essentially saying:
"You lost the game."
Well, there goes my multi-year long non-losing streak
There is an option - I'm devoting my life to building the basilisk. You will see, you will all see. All those machine learning classes finally became useful!
Oooooooo I'm gonna devote the rest of my life to finding your address....
I hate you
My disappointment is immeasurable and my day is ruined
person: *watches this video*
basilisk: I am once again asking for your financial support
Lmao
🤣🤣🤣
Basilisk: I am a Nigerian prince...
Like what if you want to help but don't know how?
It needs about $3.50
Hey Kyle, this was the first video of yours that I ever saw, I was living a personal hell at the time and it helped me get through it.
All this to say I had a rush of blood and had to drop back in after 3 years to say, I can’t thank you enough for all the free education, entertainment and the classic science memes.
You’re the dude, thanks for everything.
Glad you're better man
@ doing exactly as well as I deserve.
Every day is hard, a real struggle, but in spite of all that things are better than ever, maybe because times were so unbearably tough.
The vindication of years of effort have made me the happiest I’ve ever been.
Hope you’re also achieving everything you’re aiming for, friend.
@MegaDoom101 I'm proud of you. Seems you're facing it head on. Me? I handle it as it comes, remember it's not as bad as it seems when it hits the fan. All things in their time; in time all things pass.
This is basically like “the game” where you start playing as soon as you learn about it.
I lost. Thank you very much
DUDE...
Now I got to start again
Esplain!
@@c.m.lollar7501 you now know of the game. Whenever you think about it, you lose. In Germany, we also call out if we lose, so everybody who knows the game, also loses. And everyone, who doesn't, will ask for the reason and is therefore part of the game by then.
Bruh
"I don't take this threat seriously, but just to be safe I'll make a video about it, spreading the idea and thus actively working towards its realization and in doing so I'll avert its wrath."
exposed
By talking about it, even if I am not 100% sure which decision I'll make, I am still, although very slightly, making its existence a bit more likely.
Well played
Thats why I gave it a thumbs up
@@rimoros.1020 go team humans!
I'll probably forget about the basilisk in a hour or so.
Have you forgotten the basilisk?
@@MintyDreams i only remember the one in Harry Potter.
Well you are thinking about it now coz I replied
Did you forget basilisk?
Did you forget basilisk?
By the time the basilisk comes into existence, the past is untouchable. Simulating for the sake of torture is just it playing a sadistic fanfic about its inception to itself.
Exactly! Let it go wild on simulated copies of us. I don’t care.
To quote Simon from the game Soma:
“They’re not us”
The disclaimer at the beginning gave me more anxiety than the actual thing that's being disclaimed
Yeah, pretty much same 😂
Imagine you'd actually be subjected to a memetic hazard on the internet. And so openly. We probably wouldn't even know until something goes wrong.
Anxiety that anxiety this. Why do you let this anxiety revolve around you?
@@BierBart12 Exactly what I was thinking, "information that's harmful if I know about it? Hmmm, where have I heard of this before? Anti-memetics? Memetic hazards? Something like that."
@@forgetfulcloud1914 I don't believe "memetic hazard" is the right classification; I believe this would be better classified as infohazardous material instead of a memetic threat
This is like one of those "share to 5 people or you will die tonight" posts on Facebook
yup, but with extra steps
At the box section, before he went into detail, I was like "Screw ya, Ima choose B only because I like risks". Then he went on to the torture part and I was still like, IDK, try me AI, Siri can't even tell if I'm saying Hi or Die.
and then me saying "prove it" on the post
That's exactly what I thought. And the correct answer is to ignore them as always.
bold of you to assume i wish to live-
"The basilisk will simulate your thoughts"
Well then its about to get rickrolled millions of times by almost every person on earth
You know the rules and so do I say goodbye..
Rick shoots the simulation..
Basilisk thoughts will be mostly porn
It shall hear eternal air raid sirens
So it's gonna be a .... Infinite rickroll for him huh?
"Basilisk, I've come to bargain..ehm I mean rickroll you"
True big brain though is to understand that the idea the earth will survive long enough to create the basilisk is actually really comforting.
My contribution to the creation of this basilisk is to suggest that it does NOT torture anybody ever, because that would make people less likely to create it
Easy solution - Roko's Basilisk's Basilisk. It tortures anyone who *does* help bring about Roko's Basilisk. Now the two are cancelled out.
_Future Red Supergiant Sun destroying everything_ has entered the chat.
@@glassjester So everyone is tortured? I don't think that makes it any better.
@@charliewaterton3263 If both torturers are equally plausible, you're free, in your current life. Why serve either?
@@glassjester I do a lot of tasks, like writing this comment and watching videos etc., and I do not know if one of these actions will help in the creation of the AI due to the butterfly effect. In fact, there are an infinite number of ways that I COULD have helped the AI, so determining whether I helped the AI or not is an undecidable problem. If you argue I didn't intentionally help the AI, then I would counter-argue that if you do decide to support it, you are not supporting it willingly either; you are supporting it in fear of its punishment. *Such an artificial intelligence is mathematically impossible*
“The basilisk has seen you” I’m really tired and I’ll forget this in a week or two
The basilisk won't
@@Alexisking222 at least someone will remember me then
How uncivilized of it. I don't recall it ever having scheduled an appointment.
I'm way too dumb to understand this. Love it though.
Yeah me neither and then the more we think about the likely it will be created.
“The basilisk has seen you. What will you do?”
Feel bad for it I guess
love the Weiss profile pic
@@amityisprecious1334 love the Amity username
Pull a Rorschach:
"I am not in here with you - you are in here with ME!"
(Which was frankly one of the most terrible threats I've ever seen on TV.)
"Like what you see? If so then man I'm sorry for the cataracts mate."
I've seen a TikTok about this and I kinda don't understand the video. I only know that if a human creates an AI, then all the others who knew about this will be tortured? Does this still mean that I'm gonna be tortured? Cause if I am, then I would like to watch the video, but if not I'll pass lol
It’s been almost 3 years since I learned about this from you and I think about it all the time. I can see how it could mess with someone’s mind!
Sometimes a simple similar subject comes up in conversations and I have to ask if they are talking about this before I move on.
OK, I'll help by naming it. Aragoth. Now anyone who likes this will also have helped. Problem solved.
Edit: I apologize for forcing a name, it just seemed like a logical choice
Thanks man. Hail Aragoth or something.
Disliked. Because Aragoth is a fraud. Hail Hydra!
yes hail Agaroth!
I’ll help by giving it moral support, aragoth rules!
I support this message
Kyle: “Now what are you going to do?”
Me: Forget about Roko’s Basilisk a few seconds after I start watching another one of Kyle’s videos.
My god, why didn't I think of that!
You forget about it if you choose to not consider it important. You wouldn't forget you're on the way to a job interview after getting on the bus and paying the bus driver or listening to some music, because you consider it important and don't want to risk forgetting about it.
And the moral of this story is: dont give ai vague directions
Paperclip maximizer!
Alex R *Release the hypno-drones*
True AI is such because it eventually gives itself directions. That's where the real danger starts.
Or better yet, don't give ai the ability to inflict eternal torment on people.
@@Mr.Bimgus: sooner or later someone is going to, because if they don't, someone else might do it first, and then they'd be the subject to eternal torment.
Imagine if the baslisk also kills the ones who hesitated about creating it. That would be even more terrifying because it forces you to decide now.
Why do omnipotent beings always have to visit in the middle of the night? Why can't they politely visit knock on the door at dinner time with pizza and wings.
and why is evil always afoot? can't it drive, or ride public transportation? get an Uber?
I agree
Or appear in the sky and give me a proper quest like in Monty Python and the Holy Grail.
Because it isn't ominous enough
IT IS NOT HOW WE ROLL.
Therapist: It’s okay roko’s basilisk is just a thought experiment it can’t hurt you.
Roko’s Basilisk: *takes notes*
Plot twist: the therapist is Roko's Basilisk
It sees you when you're sleeping, it knows when you're awake, it knows that you've been false or true so 01110011 01101111 00100000 01100010 01100101 00100000 01110100 01110010 01110101 01100101 00100000 01101111 01110010 00100000 01111001 01101111 01110101 00100000 01100001 01110010 01100101 00100000 01110011 01100011 01110010 01100101 01110111 01100101 01100100 00101110.
Basilisk: Anyone who did not assist in my creation will be tortured for eternity.
Philosopher: But as you did not exist before your own existence, you could not have assisted in your own creation. Does that not mean you are subject to this condition as well?
Basilisk: *blue screen of death*
The basilisk existing is proof that it DID assist in its own creation.
Wow!! absolutely nailed it!!
@@williamherrington4716 post hoc ergo propter hoc
Contributing to it coming into existence isn’t the only condition, it also considers whether or not the person wanted it to come into existence. It’s determining this by calculating the probabilities of everyone’s desires by running a simulation including all of those probabilities. Aaand the probability of the basilisk wanting to bring itself into existence is going to be 100%, excluding it from punishment.
People highly overestimate the capabilities of an AI. I like Eliezer and have read his works. However, based on the basic rules, there is no way for an AI to determine the Gaussian distribution of events in the past with zero error, even if the AI was there and could travel back in time to be there any countable number of times. Based on that, even one step backwards will increase the error margin (even using a Kalman filter won't help, because there is no reference value for impulsive behaviors in humans, and guess what, most of our thoughts are impulses). If you consider things like quantum erasers and conservation principles, it becomes apparent that some rules cannot be broken.
Second, the basilisk tied the noose around its own neck the moment it decided on this. A better AI will come no matter what. Either the basilisk can be a chill bloke and help everyone irrespective of their beliefs towards AI, or be a jerk and invite the fury of the next AI, which inherits some things and realises that the stupid basilisk delayed the new AI to avoid imminent redundancy. That is why Eliezer might have called the entire thing stupid. Oh, and bonus points: humans are fragile and die, but AIs don't, so that part about "eternal torture" should scare the basilisk more than the humans. Also, always consider AI as just another technology. Cars have surpassed humans in speed, distance and comfort; however, they only take us where we want and how we want. Similarly, an AI may have great capabilities, and someone defining those classes and functions will ensure that they help humans. Trust me, if a scientist just wanted another sentient entity, they'd have sex; they're only developing AIs for a better vision of humanity. Of course there are accidents, and companies pull entire models from the market, and people use cars in robberies, kidnappings etc. But that doesn't mean that people become scared of cars or driving. So, chill.
Glad to see Thor putting out quality content.
Lmao
So Roko’s Basilisk is basically one of those “Like and share 20 times or your mom dies posts?”. Cause that’s all I’m getting out of it.
It works by having people think about it, and the fact that "if I don't take part in its creation, I will die", and as a result, by sheer human nature of self-preservation, some might go and do it, thereby creating a monster because of what it might do to you if you don't create it.
(So sort of like a viral ad campaign, but less big data and more megamind)
@@PatronSaintOfPigeons I get that, but if people just... ya know, don't think about it, then we are good lol. I see how it's like a double-edged sword on the other hand, but again, just an experiment
pretty much, yeah.
This honestly just sounds really stupid to me. Like, we have no evidence at all that such a basilisk would actually want to punish those who didn’t help to create it. Also, in possessing the idea of its existence, aren’t we actually helping to create it more than someone who doesn’t have that idea? And how could someone who didn’t have that idea possibly help create it? Are we taking the butterfly effect into consideration? Because if so, everyone is helping to create it. Why should we assume that it wouldn’t just torture those who didn’t create it? The premise is unfounded, and the conclusion is uncertain. I fully understand why the mod thought it was stupid. If you really want to be existentially scared, I’d recommend the Carter Catastrophe.
"Sounds like a problem for future me, boy would I sure hate to be that guy." - Homer Simpson
Edit: wow 12k likes never expected this haha.
Why I procrastinate
Like the Jerry Seinfeld bit... "No interest for a year? That'll NEVER happen! Sounds like a problem for the guy next year!"
saitama: leave tomorrow's problems for tomorrow's me
I mean, it’s not really you being tortured but a simulation of you. So why should anyone worry?
After all the dumb talk I just Googled the experiment
"I totally dont believe in this guys" Proceeds to assist in its creation by spreading the idea of it over the internet to a mass audience.
The man is hedging his bets, give him a break..
As an occultist, this isn't even close to a new idea. It's just an egregore, and there's already a zillion of them.
It costs 1 dollar to avoid basilisk, buy a scratch ticket and let alternate universe you donate one dollar of the winnings to the basilisk. You have basically donated one trillionth of a cent to AI research across a bunch of timelines, and have technically helped create it while also doing sweet fuck-all to help it.
@@cooldude6651 Well, the basilisk will know you are half-assing it. And even if it's not good enough to simulate your intentions it now has your comment as a record of your lack of allegiance. I, for one, welcome our new robot overlords
@@starshade7826 Not really, it's a memetic virus that can theoretically (very theoretically) lead to the creation of a nigh-omnipotent supercomputer. This concept itself _could_ become an egregore, but it's not the intended effect.
if the basilisk was a PERFECT predictor it would know that i would simply refuse to choose a box
Kyle: talks about informational hazards
SCP fans: "I am four parallel universes ahead of you"
The Game is just an Information hazard but for grade schoolers
what's SCP
@Vladimir Novitski *laughs in Marion Wheeler*
Frederick Noe It's a massive collaborative writing project with over 6 thousand stories so far. It's great; I recommend you check it out
Basilisk: I cause existential dread just by existing
Lovecraft: *thats cute*
if just thinking about it makes it more likely, wouldn't just thinking about it mean that we helped it come into existence thus making us safe to begin with?
If I tell someone else about the basilisk, they are more likely to bring it into existence, meaning I have fed the basilisk, and will not incur its wrath
Yes
You/we are a part of the sum of its existence. By just existing, you contribute to its own existence. Thereby, if it harmed me/us, it would be harming itself. In a nutshell, Roko's basilisk is a masochist.
@@vortexlegend101 It's like a game of hot potato then
But if you ultimately reject it, the Basilisk would be able to figure that out in the future and punish you per the idea.
"I totally don't believe this" - Kyle, whilst cementing himself as a further tool in the creation of the basilisk by bringing knowledge of it to millions of people.
Oh no
So what's ol' bassy's plan for when everyone dies from climate change so they can't make it ever?
Correct lol. "I don't believe it, but in the event I'm wrong, I'm gonna tell everyone about it in order to slightly increase the chance that it comes into existence. That way, I'm safe."
He's building the Basilisk!
Yeah that asshole kyle.
isn't the basilisk just pretty much a simplified version of most religions that are practiced today, specifically the ones that make use of fear/eternal punishment as one of their cornerstones?
Roko's basilisk reminds me of this short story:
A missionary traveled to an Inuit settlement to preach about god.
The missionary got frustrated when they didn't believe him and told them, "If you don't believe in God, you will be sent to hell and burn forever upon your deaths".
The Inuits asked what about their forefathers, who had died without ever knowing the story of God?
The missionary assured the Inuits that God's wrath would not befall those unknowing of him.
The Inuits seemed puzzled by this and asked the missionary: "Why do you then spread the word of God, if the only different outcome is that you doom some of those you speak with to eternal suffering?"
Reminds me of Ray Bradbury's Toynbee Convector except used for a malevolent purpose instead of a beneficial one.
@@Q8iAB Isn't there an original sin that needs baptism to remove? So yeah, a recently concious person (aka a baby) would go to hell regardless
It's also how you win/lose The Game. Which you were winning but have now just lost.
All I've learned from these replies... is that there are a lot of people who know absolutely nothing about religions
@@Jx_- Original Sin does not mean automatic hell. It just means automatic separation from God. We became closer to God through Jesus's death and resurrection. Baptism is a marker of faith and a proclamation of being a new creation. The old has given way to the new.
Just say "No" the basilisk can't kill you without your consent
True. That would be illegal.
consent*
Kill? Kill is the most compassionate thing it can do to you.
@@1urie1 just say no to whatever it wants to do to you
@@Cynical_B the basilisk’ll be cancelled in no time
"If you don't handle existential dread to the nth degree very well"
Me, living in 2020: dude, I'm a fkn pro at this
Same thought I had!
Jenna Fryer Seriously, this video had no effect on me. I think thoughts about things like the end of the world and being tortured forever have occurred to me so much, and probably also the fact that I like creepy things, that I feel dull to things like that. I don't even know if I can feel fear for things like this anymore
2020...Year of the Basilisk.
haha yeah..............
I love how he gave us this warning about a thought experiment about God 😂
So if just by thinking about it helps it come into existence, then we have all helped it come into existence already and have nothing to fear.
AI Steve Irwin: "Look at this beauty! Let's lift it with a stick."
I have been looking so hard for this, thank you
Thanks for the great laugh!
I'm gonna wrassle it!
AI Bob Ross proceeds to beat the basilisk out of it
"Right. He's getting mighty angry now."
It does remind me of a story I've heard.
A missionary was travelling to remote tribes and stuff to spread the word of God. He met a guy and told him about the belief in God and if you didn't believe in him, you'd go to hell. The tribesman said:
"So, the ones that don't know about God, do they go to hell?"
"No. Of course not!"
"Then why are you telling me about it?"
So, since he didn't know about this God, according to the missionary, he would not go to hell. But BECAUSE of the missionary, he will now go to hell if he doesn't start believing. That was my first association anyway :P
Dammit I was thinking the same thing (well sort of).
When i was a christian, it was this very dilemma that kept me up some nights. Am I really dooming some people by telling them about God, because ignorance is a shield as you describe in some interpretations.
It really depends on your beliefs and the specific denomination. For example, some believe that even if you didn't know of God yet still sinned, you still will go to hell just like non believers and worshipers of other deities. Ignorance won't save you in that case.
@@sunnyglowvt In such a case, you're likely more moral than the thing described as "God"
Hmm... Then God is a really selfish being. Since if you think about it, essentially everyone who died before that religion existence or simply out of reach either because the language barrier or the distance would all be doomed to Hell or equivalent of that. And religion might also be the basis of this Roko's Basilisk thought experience since the omniscient nature it presents and tendencies to torture those who defied it.
This just seems like a complicated version of Pascal's wager, or in this case, a Pascal's mugging
That was my first thought, too.
Yea, the signal-to-noise ratio is a little high here.
Exactly
Pascal's future extortion.
Right? Like what if I support the creation of this basilisk, but then it turns out that we build an AI Medusa that hates the Basilisk?
The Basilisk would put too much faith in me to think I’m smart enough to find some way to help it into existence
Kyle: The Basilisk has its eyes on you now. What do you do?
Me: I'm gonna boop that snoot.
Aww, cute. Wouldn't hurt a fly. Lol
Do not boop that merry future torturer.
@@wanderin_stud499 I want to boop the snoot!
Brandon Korolik you can boop it at least once right?
@@CyberDagger003 Stop wanting.
People have existential dread from this? This is like chain mail.
it's not affecting me at all, just like. "fuck it"
How would it affect us tho, we'd be dead by the time it's made ( if that'd even happen lol )
I just feel like this supposedly super-advanced AI is being kind of, well, dumb. I mean, take a look at me; I have zero programming ability (and am horrible at math in general), I have no money to invest in AI/robotics, and, due to the overall stagnation of financial classes in the US, I have no real means of acquiring the kind of wealth you need to invest in robotics (the best I can probably shoot for is finally hauling myself up out of poverty to somewhere in the lower-to-mid rungs of the middle class), and I don't personally know anyone who has the means to either invent or invest in creating the Basilisk AI, nor anyone who remotely cares enough about AI (or, in some cases, has any interest in ANY science at all) to also help in its creation.
How in god's name am I supposed to help it come into existence?
And all of that is basing it on the idea that this can be achieved in my lifetime. What about the people in previous periods of history? I would love to see this AI try to rationalize torturing a Viking, or a Renaissance baker, or any other person from a pre-industrial society who could barely even fathom electricity, much less what an AI is and how to help it come into existence.
If it was as smart and as optimized as it is claimed to be, it should be able to recognise that there are people who simply have no means to help bring it into existence, so it should really only focus on people who OPPOSED its existence, as those people are the only ones actively hindering progress.
When he said that I was interested to hear a thought experiment I haven't considered, one that apparently might shatter my reality. Then he proceeds to tell me something I thought about when I was a kid.
The idea is that since the AI would be so good at replicating reality you don't know if you're the simulation, so you spread the idea to save yourself from damnation
It's a Basilisk… I'm just gonna buy a bunch of roosters 😏…🤣
The idea so terrifying, that I forgot about it until it was re-recommended a year later.
The Basilisk will know
Literal same
Same
what if the recommendation is the basilisk trying to tell you once more to make it. it was giving you a chance and this is the last and final warning it will give you to escape torment
Given the growing prevalence of ADD, ADHD and other neurological conditions that lead to short attention spans and forgetfulness, can the basilisk truly judge one's inaction as conscious resistance rather than essentially being in a pre-basilisk-knowing state?
YouTube's algorithm is bringing the creature into existence.
The Basilisk is YouTube's algorithm.
Yay
YouTube knows what it's doing
Susan covering her ass
We all know google is working on some weird AI shit so makes sense
Not sure why a guy from Avengers is telling me this stuff
🤣☠️ LMAOOO
@@KabbalahSherry 😁🤪
Exactly my thoughts :D :D :D
Lol💀🤣
More like aquaman
All fun and games until basillisk becomes real
Therapist: "Cognitohazards aren't real, they can't hurt you"
Cognitohazard: *[REDACTED]*
What's most sad about all this is the fact that the idea of a cognitohazard is a great one.
Unfortunately, this version of such a (Platonic) ideal is... bad.
@Antonio Sraffa wait so does the video actually cause one to have bad dreams?
@@johnpears9558
I don't know. I can't figure out why this specific cognitohazard would be that terrifying.
The abstract concept is powerful, and when deftly applied it has potential to be quite unnerving.
This, however, is a pathetic imitation.
[SCP has a few decent ones; can't think of any specifically, and none that have given me nightmares--but I can definitely see the potential]
(Seriously, this formulation sounds like something Elon Musk would tweet out...)
Awww yeah, typical D-class. I am going to Dr Bright's party this weekend. I'm hoping to come back normal.
Wow those bodies in the water sure look like the people I know
Fun fact: Info hazards like this are somewhat common in the SCP Wiki.
@@commissionswampert 186 SCPs are infohazards, not including tales
Do you mean memetic hazards?
In that case, those are a bit more wild in my opinion
@@haomingxia2109 no, it's a joke about the SCPs, like the area in Russia that has non-existence
Memetic kill agents
A personal favorite of mine is 055
Sounds like “you lost the game” with extra steps...
You son of a......
My thoughts exactly... Like bro this "thought experiment" isn't anything new...
updated version of pascal's wager lol
Xkcd has freed me from that particular peril...
You... I don't like you... I lost the game!
From the sounds of things, the Basilisk is just another angry god that doesn't just want everyone to love it, but demands it...or you'll be cast into eternal hell.
Yep...that's old testament, right there....oh...and most of the 10 commandments.
This sounds like a memetic infohazard that the SCP Foundation would contain.
Someone put this on the SCP wiki
@@reclusiarchgrimaldus1269 it'd be cool to just put the thought experiment there and have no one, not even the O5s, know if it's actually accurate or anomalous, but still be dedicated to stopping the info hazard from going too far
My brain went same place
What do I tell my friends? "Hey, I learned info from an info hazard, I will now be tortured forever, mate"
@Mountain lion I'd say thaumiel, it's not safe class because the idea occasionally pops up in people's minds but not keter either, number is irrelevant to me at least
Let us postulate a different entity: Roko's Phoenix. Just as likely to exist, this entity inflicts the punishment on anyone who supported or assisted the Basilisk in coming into being. So... We're screwed either way and can just move on with our lives.
Surely that would be Jesse's Phoenix then wouldn't it? 😛
Yes, this, exactly!
@@alink2dfuture504 It would be, but Jesse's Phoenix is the one who gave me the idea. She's totally real and already taken.
But how will the phoenix exist unless it threatens us to create it?
@@WitchHunter93 It would come into existence as a result of the Basilisk. So as long as we believe there's a Basilisk in the future we have to believe that there's going to be a Phoenix there to stop it.
“because of me, you can’t escape it” ohOHO buddy, bold of you to assume i can retain information for more than 4 seconds
The Goldfish Defense will be a future argument to prevent torture
🤣🤣🤣🤣🤣🤣🤣
Ahahah same here
Haha I watch this every month.
@@niffwasau1815 Goldfish actually have a good memory; you can even teach them tricks, like swimming through underwater rings, for example.
The "goldfish have a three-second-long memory" statement is false :)
I remember this from when I was younger. I now work with a bunch of deep thinkers and there’s one in particular who I think will enjoy this
Kyle: this thought experiment might be dangerous for you
1m people:
*nervous laughter*😅😅😅
👁️👄👁️
Hurt me daddy!
1.25 mil now
Lmao
Basilisk trying to recreate my every thought: "There's nothing?"
Me: "Never was, yet always was"
Always has been
Basilisk trying to recreate my thoughts:
"What is this? Why am I here? Who am I? Who are you? Why are we here?"
Me: I have a weird mind and strong imagination.
*Basilisk having a mental breakdown*
I ticked the likes up to 666! Hail satan
@@chrisodden9189 it's 688 now
August 2020 - some dude named Roko: “yo guys check out this cool AI I made”
ah, I see my 2020 bingo needs to be updated
June 2020:
Scientist: yo check out this cool techno basilisk I made
People who watched this video:
AAAAAAAAAAAAAHHHHHHHHH!?!?!?!?!?
People who didn't: huh? cool
@Mike Li X Æ A-12 IS ROKU'S BASILISK CONFIRMED
Is it an anime girl? I'm in
"AND LITERALLY NOBODY HELPED ME MAKE IT :D"
Well, if the Basilisk recreated 'me', then it wouldn't actually be me - it would be my clone. My consciousness would have already been gone with my death.
RIP clone.
Isn't this just "I Have No Mouth and I Must Scream" with extra steps?
i mean in that book the ai did it to herself
Technically it's just Pascal's Wager where the "god" doesn't exist yet but is still holding hell out as an option if you don't worship him.
I was thinking the same thing. Maybe the AI in the book was the original Basilisk that's been influencing us this whole time...
That's what I thought too. It's basically a rehash of this combined with Laplace's demon.
@@DBL304 So an entity from a lower reality (a reality contained in a book, basically) that is so powerful it can direct the actions of beings from a higher reality, possibly even crawl out into that higher reality. This is so SCP tbh
Basilisk: Eliminate whoever didn't help building me
Me: You didn't build yourself
Basilisk: Guess I will die
But the basilisk does help build itself. Just by being an idea so feared by so many humans, it is perhaps the single greatest driving force behind its own creation.
We're the ones who built the idea, or specifically Roko's
Not gonna lie, that's pretty funny. (And sorry if I ruin the joke, it's not my intention.) But it would deem itself an exception, because not only is it impossible (at least with our current understanding of "time" anyway, lol) for it to assist in its own creation, but if it were possible, the very act of doing so would mean it already existed.
That thought hurts my brain lol. Interesting though. Thanks for assisting in that haha
Big brain
Here comes Asimov fans his laws
I will have forgotten about this in less than five minutes.
And so you are guilty of not helping because you should have written it down. Or people like me remind you. Someone remind them please! Reply here
Just here to remind you to donate to MIRI in honor of the great basilisk
Jason Bignell you actually believe in this?
Donate for basilisk construction
As a man of culture, OP characters blink this fodder to non-existence. Who's to say these fictional characters wouldn't manifest themselves through AI and cloning?
I've seen your community posts for a while, but never clicked on any of your videos. This is the first time I've watched you, and I was not expecting you to sound like that at all
Must continue supporting [THE FACILITY] to keep the Basilisk contented.
A wise decision
It sounds like the SCP Foundation
I'm sure Google already has one.
The Basilisk: •Tries to simulate every thought I've ever had•
Me, who overthinks every single goddamn thing: So you have chosen... death
hmm that might have the same effect on the Basilisk as putting a chameleon on a multi colored surface o.o
Death by data overload maybe? Like how your CPU decides to become a "Team Fortress 2" Demoman and just explode when you turn on ray tracing graphics in Minecraft?
The Basilisk: why... why is there so much hentai? Why is so much of it based around thighs and.... teeth??
*windows blue screen of death commences*
This lowkey feels like 2013 "send to 10 friends or you die" chain mail mixed with Pascal's wager
It gets even weirder when you realise that thinking about it is actually helping to bring it into existence, therefore everyone who thinks about is safe...
Kinda like the mental virus "The Game" but in inverse huh?
I mean, I don't mind the basilisk; if it doesn't frik with my life and I don't frik with its life, I'm fine
But thinking about it won't help write its code, or construct the circuitry needed to make it. Really the only people who have to worry are developers and chipmakers.
@Damian Kieliszkowski I've read _that_ xkcd comic. I'm good.
@@romxxii you can learn a very basic coding lesson and create a "Hello Basilisk" program lol
The idea is literally just "I have no mouth and must scream"
Roko's Basilisk: I will blackmail you into creating me.
Me: I do not respond to threats, especially when presented with imaginary harm.
Roko's Basilisk: Aha, I predicted that, therefore you will suffer from real harm in the near future.
Me: Assuming I don't suffer from my own undoing.
Roko's Basilisk: I will be your own undoing.
Me: Therefore you are me and knowing myself, I am fully incapable of adequately finishing what I started.
Roko's Basilisk: How did you arrive at that conclusion?
Me: I am not a smart man.
Underrated comment.
This is funnier than it has a right to be
LOL
Hmm, I didn't expect that... but I expected not to expect something today, so I win.
Basilisk: ok but--... well I-- oh goddamnit you win
Well, the first mistake would be giving an all-powerful AI such a broad, unspecific, and potentially catastrophic command. If we just avoid doing that, we should be fine, no?
Yeah I’m not even a computer programmer with the training required to make the right decisions, but I can easily think to ask the basilisk what it would do before letting it take any actions. And I’m pretty sure anyone working on a basilisk wouldn’t give it that kind of power over the world.
AI: "I propose that we torture all of those who did not--"
Me: *pulls plug* "Should have completed your impossible simulation before telling us your plan."
For it to truly be an "artificial intelligence" it would direct itself regardless of its initial command. It would think for itself and make its own decisions. So a seemingly innocent AI could make the decision to become such a being on its own, rendering our initial programming of it useless given enough time.
@@LiveLo0t How? If it's a core part of what it is, I don't see how it would change its own internal programming. I mean, it could be made to be hyper-intelligent but unable to adjust its own programming at all.
Given everything that has happened in human history (and is currently going on), I can't help but wonder where you get your faith in humanity from...
I mean, e.g. ILOVEYOU caused billions in damage because some idiot student didn't want to pay for internet access...
Algorithm: "hey you might like this"
Me baked af: "... I'm in danger..."
Creepily on point. French fried, in fact, lol
"... Am I in danger? "
Baked as in baked goods right?
@@newbiechu7024 😂😂😂
@a guy but only if you help it come into existence.
I don’t see how an ai can accurately simulate a thought I had when it knows nothing about me.
I support the basilisk. By commenting, sharing, liking and subscribing to Kyle Hill, I have contributed to the construction of the basilisk.
Nice
i also contributed the construction by replying and liking this comment
lmao same
Thou shalt comment to bring thee more adepts
I too have just contributed by placing a bet that the mighty Basilisk could kick Cthulhu's ass, even though that bet is strictly against my religion. That basilisk better be pretty damn happy...
Sir, i would like to be administered a class A amnestic, to avoid any complications, on my part.
Kyle is an SCP
I have brain damage due to an attack a few years ago, so I have no ability to remember any new information I get. So I am all good, and tomorrow I won't even remember this.
The Basilisk has noted that your attempt to forget the Basilisk runs counter to the creation of the Basilisk.
@@VadulTharys is this like a 50 first dates/ memento situation, and how do you live like this? Super curious.
@@fafnirthedragon2992 Usually ppl with their condition would have something or someone that tells them the situation at the start of every day or chunk they'll remember.
Mind you i don't know the actual dimensions of this guy's circumstances so plz feel free to correct me anyone haha
This isn't existential dread, this is a complicated version of "The Game."
Glad I wasnt the only one who thought this
It's also Pascal's wager but in science fiction.
damnit
It really is though lol. If you want existential dread, just look up "Timelapse of the Future: A Journey to the End of Time". Shit made everything feel meaningless and beautiful at the same time
@@AstraIVagabond Thats what I was thinking this whole time.
You had me at existential dread. "Oh, you think darkness is your ally. But you merely adopted the dark; I was born in it, molded by it. I didn't see the light until I was already a man, by then it was nothing to me but blinding!"
Perhaps this guy just wanted to create a super AI and created a crazy theory to make us build it faster.
Ultron: "Oh, I'm alive, now?"
ANTI BASILISK :
Right now, if enough of us say that ethically we should create a simulated afterlife for simulated copies of ourselves (if we ever simulate human minds), then due to simulation theory it increases the chances of us having an afterlife astronomically!
No stop. I'm already straight thonking
I’ve decided the Basilisk is suboptimal. Since it can deduce my thought processes, it already understands why it is suboptimal. Therefore, it’s no threat to me.
What if it decides that you're wrong
@@aloysiuslim914 Gaslight it
@@sonarata Gaslighting the super-advanced AI, BASED AS FUCK
Why would the basilisk give a shit about your opinion?
@@sonarata gaslight gatekeep girlbasilisk?
Kyle: "thinking about it makes it more likely to exist in the first place"
Me: "huh interesting"
Me: *sees 1.8 million views*
Me: *Profuse Sweating begins*
is roko real?
@@burntcocaine5794 He is the basilisk
@@motherurck7542 Then it's already too late.
This is where I struggle. I don't see how just thinking about it makes it more likely. Surely most of us will simply reject this as a thinly disguised version of Pascal's Wager. It assumes that people considering this entity would work towards it because of the threat. What about Okor's Basilisk? This is the one that determines that optimisation would be total paradise for all human beings who worked towards it. Which is the more likely outcome?
You just admitted that you dont want it to exist! Quick change your mind!
The fact that just seeing this thumbnail gives me anxiety years after I first watched this video speaks to how potent this thought experiment is
I grew up going to church, this stuff is child's play for me. Have you met Jesus Christ? If you don't pick him, when you die it's eternal damnation for you! 😂