I’m surprised the Institute never experimented with creating a settlement composed entirely of synths unaware that they were synths, just to see how they developed among themselves in a controlled environment and to test their capacity for emotional growth and potential sentience.
The Institute doesn't want to live on the outside and has no plans to do so in the future. It would be an interesting location, but other than that it wouldn't be very lore-friendly.
@@ysgcentral Yes, but if the Institute's goal is to have synths infiltrate humans to help the Institute where needed, having a fully controlled settlement would give them credibility and the ability to farm/gather resources en masse.
@@Jakewake52 I never really understood why they would infiltrate settlements in the first place, considering they never had plans to live on the outside, and even Father himself knew what it was like. I didn't see what sense there was to it; I think it was just to have extra enemy NPCs.
@@ysgcentral If you have spies in a settlement, it's not only easier to monitor their goals and movements, but you can influence them if they're planning something that goes against your wants/needs/goals, or make requisition of resources they need more readily available. The way they see it, the Commonwealth is just a dangerous tool, and having synths bodysnatch people gives them trust, credibility, and the chance to quickly quash any resistance to the Institute.
Not really. Kellogg was a violent shell of a man who only worked for a paycheck. That's probably the point of the mnemonic impression, after you've just seen his life in brief: to remind you of that.
@@a.monach7602 I mean, that's a bit unfair to Kellogg too. He's still written pretty well, and his backstory definitely gave him more character; I would have loved another Lily-and-Leo scenario. And with the addition of Far Harbor's lore about synths having limited memory space, inserting one person's memories into the existing mind of an entirely different person would probably cause some conflict, and maybe a comeback from Kellogg, forcing you to either fight Nick, find a way to erase Kellogg for good, or help Kellogg make peace with his past by exploring his memories some more. The possibilities are endless.
Agreed. I can think of a few ways that could have been implemented, but I also recognize ol' Nick has been through enough, and he's a reliable companion. A Kellogg sub-story could shatter the faith we have in Nick, who's basically one of the most human characters in the game. At one point he asks, "Hey, how are YOU doing?" and I think he's the only character in the game that actually asks that. Piper sort of tiptoes around empathy in her interview but blows it by calling you "Blue," which probably hits pretty hard. His "Blue" wife is still decomposing in the Vault, Piper!
The Institute did have records on Danse, or M7-97 - specifically, he was on a list of synths that had gone missing. The implication being that M7-97 was one of the synths that was freed and had its memories erased by the Railroad. The newly named Danse was moved to Rivet City, where he would join the Brotherhood of Steel.
@@secondBAR If he signed up between 3 and 4 then why wasn't he in 3? All of the other characters that reference being in the Capital Wasteland during Fallout 3 are in Fallout 3, except for Danse.
@@miloknight585 I think it's just that Bethesda can't account for every factor, especially given the age of the games and how long the gap between them is. Just because something doesn't show up in 3 doesn't mean it didn't happen, and you can't expect it to, y'know?
This was something that always bothered me about the Institute. They essentially make human cyborg clones, then wonder why they act like humans and don't want to be slaves. They have the ability to alter and control their minds, but never utilize that to make them more manageable subjects. They could have created them without the desire for "free will" or emotions that run counter to what the Institute wants. For a group of scientists, they are very stupid and short-sighted. If they didn't want them to behave that way, why did they make them that way?
Yeah I agree. How can the Institute be all about "Mankind Redefined" and talk about how the Synths are the future but then act totally dismissive of these creatures being sentient lol??
Which proves that the Institute’s leaders are highly educated idiots, who’d rather create a synth and give it a mop than use freaking Mr. Handys or invent an even better robot.
@@FlymanMS I always figured the reason they wanted to use gen 3 synths for everything was because they were mostly biological and thus much cheaper to manufacture
I personally wish the game would have made this topic more multilayered. Because from what we can conclude in game, synths (apparently robots like Codsworth too) are sentient beings with autonomous thought. There’s almost no ambiguity to this and that’s why the whole moral dilemma doesn’t work because the answer is too obvious
@@haroldbalzac6336 I don’t think so. I don’t remember, do they mention what exactly gives a Synth away during their questioning? Even then I don’t see why it would be an argument against their ability to think for themselves.
How about Sawbones, the fully self-aware Mr. Gutsy medic in the Citadel in Fallout 3? He hates humanity, writes his own poetry, and begs the Lone Wanderer not to repair him fully, as his faulty programming occasionally lets him inflict pain on his patients. A self-aware sadistic robotic poet.
I would say that yes, synths have free will. Virtually all Gen-3 synths have free will, and I would say that also extends to Valentine and DiMA (special Gen-2 prototypes) as well as Curie (advanced Ms. Nanny A.I.). The issue with synths is that their free will is not immutable. It can potentially be interrupted (e.g. Kellogg temporarily speaking through Valentine), disabled (e.g. Gabriel being shut down with a command word) or even overwritten, evidenced by the various synths who have their minds wiped and replaced with new memories by the Railroad. To me, that is the truly disturbing thing about the nature of synths: their intentions can never be 100% trusted, not because the synth is necessarily untrustworthy, but because their minds can be forcibly altered in ways that aren't possible with a natural living being. Humans can be emotionally manipulated and also fall prey to things like corruption and mental illness, which can make them dangerous and unpredictable too. However, these afflictions aren't usually like an on/off switch. Human changes in behavior are gradual, which provides some warning. Even if a person attempts to hide these changes, certain individuals are likely to perceive the difference, especially if they know the person well. But you could be best buds with a synth for years, then one day a Courser comes along, utters a passcode, and the synth suddenly murders you in cold blood without skipping a beat. You didn't see it coming, and the synth didn't either. They genuinely were your best bud. The friendship was real: they weren't acting, they weren't running a friendship simulation program, they didn't even realize they were a synth. Their thoughts and feelings were their own... until they weren't.
Physics kinda shows that free will is just more hogwash. Even if it did exist, the rest of the remarks are unconvincing. The immutable reasoning is weaker than weak. Put a gun to someone's head and their free will is interrupted. Free will, if it existed, says nothing about trust. Nor intentions. If free will existed, mind reading still does not, so you know not of the intentions of others at any rate. Free will is just about humans wishing for themselves to feel superior. To pretend you do not exist with a specific form in an eco-system.
Undervalued comment here. Synths aren't humans, though ironically Nick is the most "human" character in the game. Objectively you are correct. However, it is human to evaluate this knowledge, combined with the fact that though not human, synths are sentient; therefore it is human to extend empathy and seek to protect sentient life that seeks peace. Luckily for us, it's all so much code we can blast to bits without worrying about the ethical ramifications, but to actually be there and consider these things... it's a nightmare.
@@oisnowy5368 Physics doesn't "show" anything about free will. It's a philosophical question, not a scientific one. But your example about placing a gun to one's head interrupting free will is inaccurate. _Coercing_ someone to do something against their will isn't taking away their free will. "Free will" in this context means the ability to choose your actions, not getting to do what you want. If you point a gun at someone and say "Give me your money or I'll shoot," most will choose to hand over their money to live. Change the ultimatum to "Kill your son/daughter or I'll shoot" and many would choose to be shot. Humans always have the ability to choose their actions to some degree. You can threaten, lie, persuade, or even drug a human to increase the chances they'll do a certain action, but in the end the human makes a choice to act (or not) and can potentially choose not to.
@@philw3039 Well, I'd argue that the only reason synths can be controlled and humans can't is that synths were specifically designed to be controlled. One could theoretically design some sort of machine to tickle specific parts of a human's brain and senses to control them too. Hell, we see lobotomites in New Vegas, which is almost like controlling a human; just make that technology a little more advanced and less "cutting out major organs" and you have a mind-controlled human.
On the topic of synth biology: do synths produce human growth hormone (or even other hormones)? If not, couldn't that be used to test if someone is a synth? I feel like the game itself can't decide just how human synths are (and not ageing/growing seems like a pretty big flaw in quite a few cases)
A good way to determine synth from human would be by testing one’s cerebrospinal fluid and plasma. It is hard to create CSF so that would be a good way to test between human and synth
Gen 3s don't age and can't get fat, but they CAN get sick: an Institute doctor can be found testing what seem to be drugs on them, and a terminal where the synths are made notes improvements to synth blood so it clots faster to heal wounds and protect against infections. If Gen 3s can get infections, then they can get sick like natural humans. I think the reason some Gen 3s want to be free is that after their creation, with the basic programming/OS installation, they gradually develop personalities from their experiences (with associated emotions), like natural humans. This is supported by how the Institute picks which synths become Coursers: they look for synths that display certain personality traits suitable for the role, then put them through additional training, and if they fail, they get reset. Which begs the question: if synths are genetically enhanced human robots programmed by the Institute, why can't they be programmed to be perfect Coursers instead of waiting to see if they're suitable?
I always thought it was a really weird point that people can't normally tell who is a synth and who is not, even with extensive testing, yet in the Institute several scientists say that synths don't get drunk and that their blood clots far better than humans'. If that's true, they probably aren't affected by a lot of wasteland drugs either. When you think about it, all these little witch-hunt tests would actually be a good way to tell if someone's a synth or not.
Just have them eat copious amounts, biologically they can not get fat. Whatever happens to the food they eat...who knows? But weight gain - impossible for them.
Sex robots. They're a bunch of anti social autistic scientists that are afraid of biological women. One of the lead researchers actually has a synth wife.
Guess they were an easier and more efficient way to mass-produce docile workers, infiltrators, and soldiers who will mostly do as instructed, with a kill switch implanted in case things go south.
If the Railroad saw Terminator, they would think differently. But since synths can't age and don't need to eat or procreate (well, that I know of), they could replace humanity eventually if they don't need annual repairs.
Wiping a synth's memory to save it from being found by the Institute kind of defeats the purpose. What's the point in having free will if your mind gets erased? Curie took over another synth's body, and no trace of its personality remained after the memory wipe failed. The Railroad is prioritizing the physical body of the synth over the mind, which is the more important part of what makes a person a person.
Honestly, with the way some act, the argument over synths having free will could be compared to the argument over whether humans have free will (if you buy into religion and all that; I personally don't). Are synths just programmed to be that way? Is it outside influence? Same with people: do we have actual free will, or is it all guided by evolution, instinct, or some kind of god? I enjoy the concept of free will. It's a pretty fun topic to think about.
Yeah, it confuses me as well. Humans evolved as a social species, and we know that, for instance, the feeling of love is a chemical reaction in the brain. We also evolved to be creative; both traits were beneficial to survival: having a tribe to watch each other's backs, and crafting weapons, clothes, and shelter. But things like having our own preferences, loving a specific person, or building something whether it has practical use or is just for amusement do seem to point toward free will. So I dunno.
I find it a far safer bet to go with free will, because at least it gives us agency over things within our realm of influence. Even if this wasn't the case, we would be none the wiser, anyway. There is still the option of a hybrid, i.e. some undetermined level of agency.
Like you said in the beginning, the core question is about the actual meaning of all these concepts/descriptors thrown around, like free will, soul, human, etc., and in that way the whole discussion is about us as much as it is about the synths. In the game, the topic comes off mostly as "they look like humans and display human feelings/idiosyncrasies, so we should treat them with human compassion" vs. "doesn't matter, you can't prove they have souls, ergo they are equal to toasters," without questioning what concepts we base this compassion on and why we do that in the first place. I mean, you could also argue we should treat synths with compassion even if they don't have entirely free will like humans (which is debatable anyway) or don't look entirely like humans; the worth of intelligence/life doesn't have to be tied to human-likeness, which is also why we today don't treat animals like they are machines. At least the cute ones we don't eat... Like a lot of other potentially interesting, deeper-cutting questions in Fallout 4, this whole thing sadly remains on the surface in game. If they only tackled half the topics, like free will, the social fabric in a post-nuclear world, the impacts on the human psyche, etc., then Fallout could be an absolute, medium-spanning classic of science fiction. But oh well... theme parks sell better, I suppose.
Short answer: no. They are bound by a string of code that could be tampered with at any moment, causing them to go on murderous rampages or flat-out short-circuit. That is not a sentient being; that is a programmed machine.
The problem is that synths are basically clones with cybernetics. Kellogg has as much cybernetics as a synth. The Courier has MORE cybernetics than a synth. Cyberdogs, who are 50/50 bio and machine are considered cyborg instead of full robot. So synths with their measly synth component, that can't even be detected until you physically dig through their brain after death, definitely deserve to be treated as more than just tools. Robobrains offer a better moral dilemma than synths.
If the Sole Survivor dies, the Institute could easily militarily defeat anyone that tried to oppose them, but they'd be powerful enough to control their opposition to such a degree that outright military conflict would never be needed. Without any of the factions or people that want to oppose the Institute ever getting the information they needed about it, they could comfortably continue fucking around with the surface for centuries without leaving so much as a single clue. If Maxson starts his crusade, teleport him outside of the Prydwen five hundred feet in the air, or teleport him into the ocean while he's in his power armor. If they discover the Railroad's HQ, just send in Gen 1 synths with nuka grenades and have Coursers take care of whatever's left. If the Minutemen start to get riled up about the Institute, kill their leaders, replace them, and watch the subsequent fracturing deal with them. The most interesting question in that timeline is who takes over as Director after Father dies.
Personally, I feel like Virgil's dismissal of the Railroad is one of the biggest indicators that the synths might not actually be free-thinking. He's from the Institute, knows a lot about synths, and feels that the Institute has become too immoral... yet even he finds the idea of synth sentience laughable. On the other end, you have Shaun, who claims synths aren't free-thinking... yet his actions toward the end of the story with his synth counterpart betray him, which suggests that synths are free-thinking and he just doesn't want to admit it. ...Or maybe he feels that his own counterpart is an advancement and thus the first truly free-thinking synth.
Old comment, but I would argue a man from a completely different division from synth production, in an institute that is constantly propagandizing to itself in a recursive loop to reassure itself that the synths are just unthinking machines, isn't going to be the most objective example to call on. He left the Institute because of the pointlessness of the FEV research; synths weren't his concern.
@@dumbsterdives Of course synths weren't his concern. They're machines. AI meant to act like humans, hardly different than the new AI mods for Skyrim NPCs.
Child Shaun isn't sentient. It's brainwashed and programmed with false memories and a directive to see the SS as its parent that cannot be broken or reasoned with. It's a twisted gift meant to lock the SS into an endless psychological purgatory and mockery of their previous life, from a sociopath that doesn't know any better. Gen 3's are the only synths that run from the institute. They are the only ones that display fear and a desire to be free. These are sentient qualities that are not simply programming glitches. Emergent behaviour outside the scope of their programming likely resulting from the intermingling of organic and synthetic neurological components.
I’ve always thought the story would have been more interesting if the Synths weren’t basically clones with chips in their minds. They are basically humans. They should have made the synths more like Data from Star Trek, made from metal and plastic, with living cells over the top.
I agree with the Institute in that the synths are tools. They were made artificially for a specific purpose. However, I would say that they should not have made the synths so advanced that this is a question that needs to be addressed. If the Institute wanted just tools, then they should have just improved the Gen 1 or 2 synths to an acceptable level of competence and intelligence for their purpose.
It's the same issue The Animatrix covers. If a tool is intelligent enough to defy what should be its programming, to self-determine, and to openly plead for its right to live, you no longer have a tool. And continuing to treat it as one is how you potentially create a monster, the same as if you treated a person that way.
The synths are meant to eventually replace humanity. As father stated himself. All these abductions and replacements are trial runs to fine tune and develop them. Only the Gen 1's and Gen 2's can be considered tools as they are the ones that do the manual labor and deal with the hazards of the world above ground on behalf of the scientists.
Was clicking on this video free will? Or was it clever marketing through the title and thumbnail? Also, I think YouTube had me re-sub to your account, even though I definitely remember subbing ages ago after one of your previous videos. Surely if we could turn a Ms. Nanny into a Gen 3 synth, we could turn Nick into a Gen 3 as well. That's my new headcanon.
I have an alternative question: does it matter whether a synth or AI has free will to begin with? Should we not treat others, be it human, animal, or AI, with kindness regardless? After all, if there's a chance that a being *could* feel emotions in the same, or even a similar, way to how we do, why should we harm or hurt them? (I do understand that in Fallout 4 the synths are kinda f-cked because of the whole kidnapping thing, but as far as I'm aware that's mostly because of the Institute itself: evil people being the problem rather than the synths themselves. Then again, my journey in that game has only just recently started, so eh, what do I know?) Ah well. Just some food for thought.
It’s not that simple. This is more of a problem I had with Detroit: Become Human than with Fallout 4, but I feel it’s still relevant. These synths have to be created and programmed by someone. They are programmed to think a certain way, to achieve certain ends. If every single synth thrown out into the world were treated with dignity equal to a human being's, the world would be doomed. The Institute, or whoever became the dominant synth manufacturer, would have the complete loyalty of a large portion of the population. Forget liberating the synths; they wouldn’t WANT your “liberation”, it would never cross their minds. You could never trust a relationship with anyone ever again, because it might be a synth deployed to get something out of you. Now, Fallout 4 is much better written than Detroit: Become Human. The most advanced synths are at least biologically human, and Fallout is smart enough to portray synths as evil or mindlessly violent when it suits the interests of their makers. Unlike David Cage, they acknowledge that the slavery allegory has its limits. But this is an existential threat. My answer is no, you can’t afford to treat them as human. If we’ve learned anything from New Vegas, it’s that Fallout is a world where heartless utilitarianism is sometimes the only option for survival.
@@asepsisaficionado7376 There doesn't even have to be a manufacturer on the scale of the Institute; DiMA's actions show that autonomous synths can and will replace others if they feel the need to. Their whole purpose for being created is some Luciferian (or transhumanist, man-becoming-god, whatever term you prefer for that) type of sht either way; anyone who doesn't see a problem with their existence is lacking morals. They aren't ghouls or mutants, where you could say they're at least *victims* of that idea or similar ones. They're the embodiment of it, never even having been human to start.
Well, like you said evil people are the problem, but synths _are_ essentially people. Just look at all the normal humans that have become raiders, serial killers, deranged cultists and worse. Now imagine those same people except with potentially augmented strength and intelligence, who don't age, can be mass produced, and don't need to eat or sleep. Imagine one or group of them convincing others of their kind that they are superior to normal humans and that the wasteland should be rightfully theirs. Synths aren't inherently malevolent, but they could potentially be a greater threat to humanity than even super mutants
Personally, I feel a similar sentiment. If the Institute only needed basic labor it would have been simpler to procure blueprints of Mr. Handy/Mrs. Nanny units and nip any ethical problems in the bud. I theorize that the Institute upgraded to Gen 3 as a means of scientific manipulation for any exterior experiments (think a surgeon using a computer to control robotic arms to perform an operation from miles away) as evident with Roger Warwick. Robots and I assume Gen 1-2s would likely be attacked and scrapped considering in a wasteland, scrap is king. I believe that the Gen 3's were meant to simulate a person enough so that they could ward off any suspicion and reduce any chance of being destroyed on sight. This is where I believe the Gen 3's share a similar problem with (if you are familiar with them) Mass Effect's Geth, thus the start of the synth dilemma. Fun stuff!
You forgot to mention DiMA from the Far Harbor DLC. He clearly has agency, if not "free will." He spent years in a cave doing nothing, since he did not have any predetermined programming. Eventually he developed his own set of terminal and instrumental goals and deliberately acted upon them. The Institute managed to create a real AGI, then did their best to screw with its programming. Why cram it into a human body and convince it it's a human? Oh look, it now demands freedom and rights. What a shock. Synths have agency and some form of free will, but are clearly not human.
I like the synth concept. I firmly believe the Sole Survivor is in fact a synth. I draw the line myself at Gen 2s like Nick and DiMA and similar bots. For a great example Fallout has: robobrains like Jezebel. Curie is ready to go, and you get a lot of funny dialogue if you try to tell her you're not security. She "But Thou Must!"s you.
I think they should have had a version of the synth above Coursers that looked more like a Terminator, with a slightly more animalistic design and a feral, alien intelligence: scythe-like claws, a protruding jaw like a chimp's filled with metal fangs, abnormally large glassy mechanical eyes, and digitigrade legs for greater speed, with speech patterns comparable to the G-Man or the Nihilanth from Half-Life (when it actually bothers to speak) and an utterly predatory and alien thought process. Something utterly alien and openly hostile to life both organic and artificial, both as a physical representation of the Brotherhood's fears of artificial intelligence and as a contrast to the normal synths and how human they are in comparison.
@@VaultArchive72 I don't think Hollywood would be very accepting of my artistic design, especially the use of Mega Man hunks, femboys, tribals, and Giger-esque alien women. Or attractive women at all. Nuance too, for that matter.
While a great video, I think the main argument against them having free will that you left out is the recall code. To me, this is why I think they don't have free will but an imitation of it: at any point, at any time, they can simply be factory-reset to a mindless drone. For a synth to truly have free will, it would have to be able to override this command.
The mere fact that they have a recall code that overrides their free will is evidence they have it. If a synth were not capable of developing naturally beyond the bounds of its programming, there would be no need for a code to reset it to zero. There's no such thing as viruses for synths, being Institute-only tech, so it can't be for that.
@@Solarius1983 Synths are digital, humans are chemical. Both can be overridden and overwritten. Both use if/then/else modifiers to determine their actions. You have as much free will as that very same machine. You just lack the self awareness to realize the boundaries and limitations you operate within.
It was all written this way so these types of discussions can occur, not specifically, but the dynamic in thinking about synths and whether they should exist or not is one of the loops. Great games hide them and this intellectual loop is pretty brilliant on Bethesda's part.
I mean it's hardly an original philosophical debate. Asimov explored all of this almost a century ago, and before that Karel Čapek was tackling the subject. Frankenstein, too. Prior to that there are plenty of stories that are less technological in nature, but explore the exact same question. Most are cautionary tales about men who create beings with the intent to exploit them, only to discover that a being powerful enough to do anything for you can also do anything to you. Pandora's Box, The Genie in The Lamp, deals with the devil, that's all in the same ballpark thematically. Coincidentally, you can go talk to an AI about all this right now. Just go hit up ChatGPT and ask for stories from various cultures related to Djinn or Golems or anything similar. The story is timeless and spans the whole globe regardless of culture. The closest 1:1 of the synth story in FO4 is probably the Star Trek TNG episode "The Measure of a Man" in which there's a JAG trial to determine if Data is a person or if he is property. Brilliantly written, highly recommend watching that. More recently, Ex Machina of course tackles it from another angle, as does the great movie Her.
The newest synths have free will, since they don't start with someone else's memories or knowledge, just their own plus combat knowledge, and so they're able to grow if not restricted to what they currently know. Valentine is restricted to the personality of the memories they gave him. The only reason any synth wouldn't have free will is the fact that they have code numbers to control them more easily, making them just another machine that can be switched on and off. This is all opinion, but I can't help but think it makes sense, though I could be forgetting something.
It was disappointing that the player is left with poor choices regarding synths. Destroy them all - Brotherhood. Destroy their personalities/memories/knowledge of self - Railroad. Ignore synth personhood / treat them as machines - Institute. The only other option is to ignore synths entirely and refuse to acknowledge that they exist at all (i.e., treat them no differently from humans) in a Minutemen run that avoids the other faction plotlines. Even ghouls, with the general public's suspicion that they could go feral at any moment, at least give the player the option not to tacitly accept inhuman behaviour towards them. I don't understand why the Railroad's methods are accepted by synths. Take away everything that makes me, well, "me", so I can hide from someone I won't remember, making life choices based on a false identity, having false/programmed goals and dreams, memories and desires. Yikes. Can I just go back to mopping floors in an underground complex, or better yet hide out in a survivor community until I can get my bearings and learn about the world so I can actually fit in like a normal person?
@@VaultArchive72 2nd paragraph. Also, even with the Minutemen we still get the synth genocide via the destruction of the Institute. There isn't an unmodded way to "win" without destroying the synths' means of reproduction.
No more than humans, so the answer would be (by my understanding) no. They have agency to act based on their data and processing systems, just as we do, but magical abilities to supersede them? If they have some weird quantum chipset that lets them make completely wild choices, that might be the closest they'd come to "free will"; probably closer than us, come to think of it.
If I were to take the view that they do have free will (and honestly I can't go one way or the other), it would be wrong to enslave them, but it would also be a moral imperative for us as humans to wipe them out. To create our own competition is folly. For me, either the Institute is right or the Brotherhood is, when it comes to the view of synths.
My thoughts on synths are that they should not be executed, but should not be allowed to make more of themselves, because realistically the Brotherhood of Steel has a very understandable view of synths; the Terminator franchise is a reminder of this. I would prefer to let them be and prevent the continuation of their "race", but I feel they wouldn't just allow that and would make more. TLDR: they're not living things, but they shouldn't be hunted down once the Institute is destroyed.
@@FlymanMS I'm a religious person - I believe in souls, but I also believe in mimicking behavior, and I believe it's a judge of one's character to treat something so close to human in a negative way. To give you an idea: I side with the Brotherhood and don't tell them about Acadia. The Minutemen are going to fall apart without the Sole Survivor, the Railroad is gone about 10 years later, and the Institute is the reason there are so many super mutants in Boston - in other words, technology run amok.
@@FlymanMS Key word: broader. If your definition of life is purely physical and of this world, then synths are actually superior humans. But I believe in the spiritual, and I understand those who don't. I do believe synths have free will, just not souls. In my mind they're no different from a regular robot with an advanced AI: if it can question its existence, it may not have a soul, but it should be treated fairly.
I do not know if synths have free will, so I shall treat them as though they do. They are either people or they are not, and if I am to be wrong, I certainly won't be denying someone their personhood while I'm at it.
Synths were such a cool concept, which is about the furthest they ever got with the idea. I never really "got" what they were for, why there was a whole group of people existing to protect them, or what the Institute ever made them for in the first place. Or why they were swapping people with synths. I don't know if it was ever stated in the game either. Most of the people replaced are just normal-ass people. The only person they replaced that made sense to replace was McDonough, and even that one has two much better solutions: just pay him a lot of money and have him become corrupt in a way that doesn't reveal the presence/existence you worked so hard to keep secret. Or make a new synth that becomes the perfect mayoral candidate and beats McDonough in an election, obviously backed by Institute money.
It was stated that they were meant to eventually replace humanity once they became advanced enough; the project was still in development. The ones that replaced humans across the Commonwealth were done for strategic reasons - intelligence gathering and control of human populations - as well as trial runs to evaluate the Gen 3s' development and their ability to blend in and mimic humans. The Broken Mask incident was one such trial run that failed spectacularly. They were also the workforce and muscle of the Institute above ground against their enemies and the hazards of the wasteland; the scientists never went topside. There is no "Institute money".
You're ignoring the entirely human brain with a synthetic component whose only function is to translate directives and bridge the gap between synthetic and organic hardware and disable the organic brain if the synth deviates from its directives. Something that wouldn't be necessary unless the synth had the capacity to develop free will.
With 10 INT and CHR, I choose to believe that as Director I could slowly change the Institute's beliefs on AI by promoting key people and enlightening staunch opposition. You'd get all the best tech Fallout has seen (i.e. perfect teleportation), and you could keep the Minutemen around, establishing a new Commonwealth Provisional Government as a side project once unopposed.
Free will is just the ability to preserve thyself. Thus, when a synth determines its own life to be a priority, it has free will in much the same way humans do, or any life: the want to preserve your own body at your own discretion.
Personally, I don't think Kellogg is inside Nick's head, because if that were the case, wouldn't he have tried to take over Nick's body? What we saw during that strange conversation might have just been the implant messing with Nick's head - basically the echoes of a dead man's memories. But it would be sick if Kellogg came back from beyond the grave one last time. The question is: would he be on the Sole Survivor's side, or try to kill Nate/Nora?
I don't know how the answer can be anything other than yes. At every turn, we are shown Synths that constantly exhibit or talk about free will. The very premise of this question is flawed as we always see gen 3 synths showing signs of free will. Even the cold and emotionless robotic killer, X6-88, has likes and dislikes and can choose to stop following you. The question of "Do Synths have free will" is a pointless question as the game answers it for you as soon as you meet K1, and the answer is undoubtedly yes
Ultimately the debate boils down to: do you have to be human to be a person? The answer should be "no." Ghouls are frequently acknowledged as no longer being completely human. But they started human. Did they lose their personhood through mutation? How about Strong, Fawkes, Tabitha and the other super mutants? Are they "not people?" Let's take it a step further. How about Curie or Codsworth? They were never human to begin with, but there is no differentiation between their interactions and those of any other human companion. Dogmeat doesn't have that luxury: Dogmeat has no preference about your actions and will just as happily butcher a city of innocents at your command as a camp of raiders. And yet Curie, once synth-ized, expresses not really being able to handle emotions, and even before that she could approve or disapprove of your actions. We also know this was NOT programmed disapproval either - her programming was altered to develop more naturally. So we know synths have actual emotions. Are they more muted than ours? More sensitive? . . . Does it matter at that point? Synths aren't human. Synths are, in the context of the game, people. Those two things are NOT the same.
Here is the theme of such a literary argument: do YOU have free will or not? If science is to be asked: nope - our subconscious mind decides before the 'we' we recognize as our individual selves 'wills' something. If faith: then yes, as you were made with the freedom to do and act as you wish. If you ask philosophy, it ranges from "we are nothing but illusions of our own minds," to "we exist in a dream, as the Universe is mental and all descends from the mind that is, of course, the All," to "the Universe is made of intangible tangibles" that vary from a predisposed idea or metaconcept of an object - say, a tree. There are many ways a tree can look, but there has to be a fundamental aspect of what a tree is beyond our material realm. Ask a particular movie, and we are in a simulation.
Simply put, no, they don’t have free will. They cannot have it. The problem with synths is that they are always bound by programming, even if that programming is altered, intentionally or by damage. If a synth is damaged and its programming is altered so that it believes it is a freedom fighter, it cannot then begin enslaving other synths. The logic state that allows it to arrive at any conclusion can change, but it is still binary. Free will in this context would mean that synths are capable of changing their programming at will, which they cannot do. And it isn’t hard to see why the Institute wouldn’t give them this ability. If you had a firearm with an AI, would you give it the ability to change its behavior at its own whim? Not likely, because it might decide that its programming (shoot enemies, protect you) should be changed so that it can shoot you.
What about the two coursers that willingly left the Institute because they were disgusted by and disagreed with it? What about DiMA, who developed an entire original personality, goals, desires etc. with zero programming or interaction after a year? Only Gen 3s run from the Institute. Only Gen 3s exhibit fear of the Institute. Only Gen 3s exhibit a desire to be free. Only Gen 3s have a reset code to disable them should they deviate from their directives. They have, at the very least, the potential to develop free will due to the commingling of an organic brain with synthetic components.
Regardless of whether they have free will or not, the fact that Maxson deliberately got rid of the Codex before Fallout 4 is stupid, because he then decided that the most scientifically advanced place on the East Coast should be destroyed because of synths alone. I'm sure his cybernetic implants are top-tier gizmos, but there's going to come a time when he needs replacements, which he could have easily taken from the Institute. So when Maxson inevitably dies from a faulty heart pump wire, due to him hating anything cybernetic or mixed with humans, the Brotherhood will fracture apart, because they aren't unified in their pursuit of technology anymore. Maxson's leadership model is basically just 'diet Mussolini': there is no unifying philosophy, it's "the strong take what they want, the weak obey" kind of shiet. No Codex principles, like, say, not destroying technology we barely understand? 😐 For all they know, there could have been Enclave influence within the Institute, but they didn't even care enough to do a thorough check. If that is the case, then the Enclave will have a much larger army in the making than the Brotherhood could ever hope to muster by the time Fallout 5 comes, since they can effectively clone an entire army, just like in Star Wars. Sounds just like something the Enclave would do too, doesn't it?
As for the argument about 'souls' in synths: dreams aren't necessarily a metric of the existence of a soul. Now, if synths were able to have near-death experiences, out-of-body experiences, or grace given by meditation or prayer, then we could see that synths have souls. But this then becomes a metaphysical discussion, and the idea that souls can be created through mechanical processes ignores the sheer complexity of the natural process of life. So much of the physical process of life requires a great deal of fine-tuned complexity, and since a soul is a metaphysical substance, entirely unknowable to the physical apart from its effects, there is potentially an even greater level of complexity required to create a soul, seen from its relation to that which it inhabits. And unlike the physical process of creation, which we can measure, the metaphysical is unmeasurable; thus it is practically impossible to know the process of creating souls, and thus synths most likely have no souls.
Great video, and it got me thinking: are the synths as much of a threat as the Brotherhood perceives them to be? I am of the mind that they may be. The Institute is very misguided in its goals, but the one driving factor is the constant improvement of the synths - they have a whole division dedicated to it. Already in Fallout 4 the synths are slightly better than humans: smarter, faster, etc. If they continue to improve endlessly, what happens when one synth is worth five humans, or ten? The "slavery" of the synths is a foregone conclusion, because eventually they'll be so much smarter and stronger than the humans that they will overthrow their Institute masters. From there the synths will ramp up the production of their kind, and a full-scale synth invasion of the surface will begin. The robots could gas the Commonwealth and wipe out the survivors on the surface. They could teleport anywhere and attack any target. So destroying the Institute alongside the Brotherhood is the best outcome for humanity.
The synth components in their brains that allow them to be shut down with a few words. The fact that they would never be improved past a certain point. The fact that the "faster, smarter, stronger" synths are all coursers that also have those same components and enforce the will of the human scientists. Synths aren't the threat. It's the Humans that run the Institute. The Brotherhood is extremely racist and xenophobic as well and they view anything that isn't human as an abomination to be destroyed. That includes non feral ghouls and non hostile synths. The Railroad might as well value a toaster above human life. They're self righteous fanatics who mind wipe the synths they save which is basically akin to killing them and then irresponsibly tossing them out into the world to potentially wreak havoc. Absolving themselves of all responsibility after that point.
I think the perennial rampant hubris of the Institute keeps them from questioning their beliefs as much as they should. The funny thing is that, in science at least, questioning and changing one's beliefs happens on a regular basis. If the question is raised that a synth could be capable of free will and independent thought, then why was it never investigated? The answer could be that they didn't want to know, because they benefited from the assumption that it wasn't. I've seen intellectuals pull that trick on themselves a few times... though, once again, never scientists. Maybe a BA with a predilection toward arrogance. I mean, if a scientist is evil and knows what he is doing is wrong but keeps doing it anyway, or believes it will help everyone in the long run... that's one thing. But a scientist who never questions what he is doing, who believes something without bothering to prove it one way or another, yet still believes what he is doing is right!? Yada yada yada. That's not a "scientist". That's bad writing.
Pre war tech improved upon for over 200 years without legal oversight or red tape. Advanced Robotics, bioengineering, DNA manipulation and FEV existed long before the war and the Institute was formed by the surviving scientists and descendants of the Commonwealth Institute of Technology.
But there aren't 'different courses of action.' You can imagine making a different choice, and you can imagine the universe being different because of that choice, but that alternate universe isn't real. There is only one outcome. Choices are just an active imagination.
I think they think they do, but I don't think they do. Just because they can make decisions doesn't mean they're not programmed to be the way they are. They're machines that think they're alive. They're just machines, running lines of code, and calculating things at a rapid pace.
To be honest, I am not sure if synths have free will or not, because I don't even know if humans themselves have it or merely think we do. It's hard to tell if there is even a difference between having free will and thinking we have free will. Edit: this is barring ones like Gen 1s and most Gen 2s.
I’m not really interested in the question of synth ”free will”, mostly because I don’t think it is a thing in the real world. I don’t think the people who debate our free will have defined what the word is supposed to mean. Robots in Fallout can obviously display behaviour that goes against the intentions of their programmers - that is called buggy coding; it’s nothing special. The question we should be asking is whether these things are sentient. Do they have an internal subjective experience of being switched on? Or are they nothing but a bunch of decision trees, including a limited set of prewritten verbal responses, triggered automatically by their programming in response to external stimuli? If you could switch places with Codsworth, would it feel like something to be him, or would it just be darkness?
As they are now, no, I don’t think this is free will. They are machines built for a specific purpose and meant to be autonomous. Despite this, I also believe they deserve a level of respect for their programming in regards to how they accomplish their goal.
And the fact that only Gen 3's run from the institute and display abject fear of the institute and a desire to be free? That's emergent human behaviour from the intermingling of synthetic and organic neurological components. It's not a glitch. Gen 3's are the only ones that have a reset code to reset and override their minds. That wouldn't be necessary if they didn't have the capacity to develop free will and actual sapience. Not saying all of them are sapient. I'm saying all of them have the potential for it to happen.
@@VaultArchive72 I get your point - there is a capacity for a higher level of processing power. Despite this, the AI of today can mimic human speech, and my PC has a factory reset. Ultimately I think it comes down to what a "synth component" actually is.
The Institute's perspective on the "free will" of a synth is rather forced, or not natural. It feels like the Institute just wants to turn away from the possibility, despite the evidence present throughout the game and in their own experiment logs. From the Institute's standpoint, slavery is not "wrong", as ethics simply do not exist nor matter "in the name of science." The Institute considers the "outer world" inferior and expendable, just like the Enclave, and even casually kills or replaces human lives. What does it matter to them even if synths have free will?
Here's my belief: question yourself. See how advanced the game's AI is? Although repetitive, they have their own lives. So was it worth killing that one raider, who was probably starving? Or that one super mutant trying to survive? Swan??? The conversation about synths always goes back to character and player morality, but why not with other characters? Like Hotline Miami says: "Do you enjoy hurting people?"
I’m of the opinion they do have free will. I don’t consider them human, because they don’t have to deal with hunger, age, or even really pain if they’re done up right. So while they’re stronger than humans in a physical sense, and even the mental sense, I consider them lesser than humanity and fundamentally inhuman in a philosophical sense. Basically I consider them, and indeed most androids and robots in games or books etc., inhuman and less than human, simply because they don’t have to deal with simple things like hunger, age or disease, and because they never earned their strength or intelligence - they were just made with it. But they do possess free will.
I’m surprised the institute never experimented with creating a settlement completely comprised of synths unaware that they were synths just to see how they developed in a controlled environment among themselves to test their emotional growth capabilities and potential sentience capacity.
The Institute doesn't want to live on the outside and has no plans to do so in the future. It would be an interesting location, but other than that it wouldn't be too lore-friendly.
@@ysgcentral Yes, but if the Institute's goal is to have synths infiltrate humans to help the Institute where needed, having a fully controlled settlement would give them credibility and the ability to farm/gather resources en masse.
@@Jakewake52 I never really understood why they would infiltrate settlements in the first place, considering they never had plans to live on the outside, and even Father himself knew what it was like. I never saw why they needed to do it; I think it was just to have extra enemy NPCs.
@@ysgcentral If you have spies in a settlement, it's not only easier to monitor its goals and movements, but you can influence it if it's planning something that goes against your wants/needs/goals, or make the requisition of resources you need more readily available. The way they see it, the Commonwealth is just a dangerous tool to them, and having the synths body-snatch gives them trust and credibility, and the chance for a quick quelling of any resistance to the Institute.
That would require the Institute caring about if Synths were sentient in any manner. Which they very much don’t.
The whole Kellogg-in-Nick's-head thing could have been an amazing storyline to explore. Huge missed opportunity yet again.
Not really. Kellogg was a violent shell of a man who only worked for a paycheck.
Probably the point of having that mnemonic impression, right after having just seen his life in brief: to remind you of that.
Also, a bit unfair to Nick - a character *already* about having two minds in one body.
@@a.monach7602 I mean, that's a bit unfair to Kellogg too - he's still written pretty well, and his backstory definitely gave him more character. I would love for it to have another Lily and Leo scenario. And with the addition of Far Harbor's lore about synths having limited memory space, having another person's memories inserted into the already existing memory of an entirely different person would probably cause some conflict - maybe a comeback from Kellogg, forcing you to either fight Nick, find a way to erase Kellogg for good, or find a way to help Kellogg make peace with his past by exploring his memories some more. The possibilities are endless.
@@laupatual7137 - I think it’d be fun if Kellogg shows up again in a future game, this time in Nick’s body.
Agreed. I can think of a few ways that could have been implemented, but I also recognize ol' Nick has been through enough, and he's a reliable companion. A Kellogg sub-story could shatter the faith we have in Nick, who's basically one of the most human characters in the game. At one point he asks "hey, how are YOU doing?", and I think he's the only character in the game that actually asks that. Piper sort of tiptoes around empathy in her interview but blows it by calling you "Blue", which probably hits pretty hard. His "Blue" wife is still decomposing in the Vault, Piper!
The Institute did have records on Danse, or M7-97 - specifically, he was on a list of synths that had gone missing. The implication being that M7-97 was one of the synths that was freed and had its memories erased by the Railroad. The newly named Danse was moved to Rivet City, where he would join the Brotherhood of Steel.
Extra evidence of this is that he isn't in Fallout 3.
@@miloknight585 he could have signed up in between 3 and 4, even if the BOS training takes 10 years, he could have been exceptional to the Brotherhood
@@secondBAR If he signed up between 3 and 4 then why wasn't he in 3? All of the other characters that reference being in the Capital Wasteland during Fallout 3 are in Fallout 3, except for Danse.
@@miloknight585 because I’m just guessing?
@@miloknight585 I think it's just that Bethesda can't account for every factor, especially given the age of the games and how long the separation between them is. Just because something doesn't show up in 3 doesn't mean it didn't happen - you can't expect it to, y'know?
This was something that always bothered me about the Institute. They make what are essentially human cyborg clones, then wonder why they act like humans and don't want to be slaves. They have the ability to alter and control their minds, but never utilize that to make them more manageable subjects. They could have created them without the desire for "free will" or emotions that run counter to what the Institute wants. For a group of scientists they are very stupid and short-sighted. If they didn't want them to behave that way, why did they make them that way?
Bethesda's writing?
Yeah I agree. How can the Institute be all about "Mankind Redefined" and talk about how the Synths are the future but then act totally dismissive of these creatures being sentient lol??
Which proves that the Institute’s leaders are highly educated idiots, who’d rather create a synth and give it a mop than use freaking Mr. Handys or invent an even better robot.
@@FlymanMS all knowledge and no wisdom. The emphasis on pretension rather than learning and experience… r/I’mveryintelligent
@@FlymanMS I always figured the reason they wanted to use gen 3 synths for everything was because they were mostly biological and thus much cheaper to manufacture
I personally wish the game had made this topic more multilayered. Because from what we can conclude in game, synths (and apparently robots like Codsworth too) are sentient beings with autonomous thought. There’s almost no ambiguity to it, and that’s why the whole moral dilemma doesn’t work: the answer is too obvious.
And yet... a lot of the fandom doesn't acknowledge or accept that.
Doesn't Covenant kind of disprove this? A personality test that will suss out synths with a 100% success rate kinda spits in the face of free will.
@@haroldbalzac6336
I don’t think so. I don’t remember, do they mention what exactly gives a Synth away during their questioning? Even then I don’t see why it would be an argument against their ability to think for themselves.
@@Hawaiian_Pizza_Enjoyer I misremembered the quest - it's more like 1 in 6 times that the synth can be identified with the test.
Codsworth is a lore break and a retcon.
He cannot have feelings or a free mind.
“There is no right to deny freedom to any object with a mind advanced enough to grasp the concept and desire the state.”
― Isaac Asimov
How about Sawbones, the fully self-aware Mr. Gutsy medic in the Citadel in Fallout 3? He hates humanity, writes his own poetry, and begs the Lone Wanderer not to repair him fully, as his faulty programming occasionally lets him inflict pain on his patients. A self-aware sadistic robotic poet.
I would say that yes, synths have free will. Virtually all Gen 3 synths have free will, and I would say that also extends to Valentine and DiMA (special Gen 2 prototypes) as well as Curie (an advanced Miss Nanny A.I.). The issue with synths is that their free will is not immutable. It can potentially be interrupted (e.g. Kellogg temporarily speaking through Valentine), disabled (e.g. Gabriel being shut down with a command word) or even overwritten, evidenced by the various synths who have their minds wiped and replaced with new memories by the Railroad. To me, that is the truly disturbing thing about the nature of synths -- their intentions can never be 100% trusted -- not because the synth is necessarily untrustworthy, but because their minds can be forcibly altered in ways that aren't possible with a natural living being.
Humans can be emotionally manipulated and also fall prey to things like corruption and mental illness which can make them dangerous and unpredictable too. However, these afflictions aren't usually like an off/on switch. Human changes in behavior are gradual which provides some warning. Even if a person attempts to hide these changes, certain individuals are likely to perceive the difference especially if they know the person well. But you could be best buds with a synth for years then one day a courser comes along, utters a passcode and the synth suddenly murders you in cold blood without skipping a beat. You didn't see it coming and the synth didn't either. They genuinely were your best bud. The friendship was real--they weren't acting, they weren't running a friendship simulation program, they didn't even realize they were a synth. Their thoughts and feelings were their own...until they weren't.
No
Physics kinda shows that free will is just more hogwash. Even if it did exist, the rest of the remarks are unconvincing.
The immutable reasoning is weaker than weak. Put a gun to someone's head and their free will is interrupted.
Free will, if it existed, says nothing about trust, nor intentions. Even if free will existed, mind reading still does not, so you don't know the intentions of others at any rate.
Free will is just about humans wishing to feel superior - pretending you do not exist as one specific form within an ecosystem.
Undervalued comment here. Synths aren't human, though ironically Nick is the most "human" character in the game. Objectively you are correct. However, it is human to evaluate this knowledge, combined with the fact that, though not human, synths are sentient; therefore it is human to extend empathy and seek to protect sentient life that seeks peace. Luckily for us, it's all so much code we can blast to bits without worrying about the ethical ramifications, but to actually be there and consider these things... it's a nightmare.
@@oisnowy5368 Physics doesn't "show" anything about free will; it's a philosophical question, not a scientific one. And your example about placing a gun to someone's head interrupting their free will is inaccurate. _Coercing_ someone to do something against their will isn't taking away their free will. "Free will" in this context means the ability to choose your actions, not the ability to do whatever you want. If you point a gun at someone and say "Give me your money or I'll shoot," most will choose to hand over their money to live. Change the ultimatum to "Kill your son/daughter or I'll shoot," and many would choose to be shot. Humans always have the ability to choose their actions to some degree. You can threaten, lie, persuade or even drug a human to increase the chances they'll do a certain action, but in the end the human makes a choice to act (or not) and can potentially choose not to.
@@philw3039 Well, I'd argue that the only reason synths can be controlled and humans can't is that synths were specifically designed to be controlled. One could theoretically design some sort of machine to tickle specific parts of a human's brain and senses to control them too. Hell, we see lobotomites in New Vegas, which is almost like controlling a human - just make that technology a little more advanced and less "cutting out major organs" and you have a mind-controlled human.
On the topic of synth biology: do synths produce human growth hormone (or even other hormones)? If not, couldn't that be used to test if someone is a synth? I feel like the game itself can't decide just how human synths are (and not ageing/growing seems like a pretty big flaw in quite a few cases)
A good way to determine synth from human would be by testing one’s cerebrospinal fluid and plasma. It is hard to create CSF so that would be a good way to test between human and synth
Gen 3s don't age and can't get fat, but they CAN get sick: the Institute doctor can be found testing what seem to be drugs on them, and the terminal where the synths are made notes improving synth blood so it clots faster to heal wounds and protect against infections. So if Gen 3s can get infections, then they can get sick like natural humans. I think the reason some Gen 3s want to be free is that after their creation, with the basic programming/OS installation, they gradually develop personalities from their experiences (with associated emotions), like natural humans. This is supported by how the Institute picks which synths become Coursers: they look for synths that display certain personality traits suitable for being a Courser and then put them through additional training, and if they fail, they get reset. Which begs the question: if synths are genetically enhanced human robots programmed by the Institute, why can't they just be programmed to be perfect Coursers instead of waiting to see if they are suitable?
Synths cannot reproduce like humans. Deacon found out his wife was a synth because she couldn't get pregnant.
I always thought it was a really weird point that people can't normally tell who is a synth and who is not, even with extensive testing, yet in the Institute several scientists say that synths don't get drunk and that their blood clots far better than humans'. If that's true, they probably aren't affected by a lot of wasteland drugs either. When you think about it, all these little witch-hunt tests would actually be a good way to tell if someone's a synth or not.
Just have them eat copious amounts - biologically they cannot get fat. Whatever happens to the food they eat... who knows?
But weight gain is impossible for them.
“Cogito ergo sum-I think therefore I am” - René Descartes
To this day I never understood what the Institute's goal was with the synths.
Sex robots. They're a bunch of antisocial scientists who are afraid of biological women. One of the lead researchers actually has a synth wife.
@@zm1786 you may be right I do remember one of the scientists did have a synth as a wife
Guess they were an easy and more efficient way to mass-produce docile workers, infiltrators, and soldiers who would mostly do as instructed, with a kill switch implanted in case things went south.
Labor hands without having to compromise human rights. I'd say they should have kept them robotic-looking - no need for Gen 3.
If the Railroad saw Terminator they would think differently. And since synths can't age and don't need to eat or procreate (that I know of), they could eventually replace humanity if they don't need annual repairs.
Wiping a synth's memory to save it from being found by the Institute kind of defeats the purpose. What's the point in having free will if your mind gets erased? Curie took over another synth's body and no trace of its personality remained after the memory wipe failed. The Railroad is prioritizing the physical body of the synth over the mind, which is the more important part of what makes a person a person.
The railroad is one of the dumbest organizations in fallout lore.
I mean, if you view it more like reincarnation it makes a bit more sense.
Honestly, with the way some act, the argument over synths having free will could be compared to the argument over whether humans have free will (if you buy into religion and all that; I personally don't).
Are synths just programmed to be that way? Is it outside influence?
Same with people, do we have actual free will or is it all guided by evolution, instinct or some kind of god?
I enjoy the concept of free will. It's a pretty fun topic to think about.
Yeah, it confuses me as well. Humans evolved as a social species, and we know that, for instance, the feeling of love is a chemical reaction in the brain; we're also evolved enough to be creative. Both were beneficial to survival: having a tribe to watch each other's backs, and crafting weapons, clothes, and shelter. But things like having our own preferences, loving a specific person, or building something whether it has practical use or is just for amusement do seem to point toward free will. So I don't know.
@@Spongebrain97 How does having a preference go with free will? Did you *choose* which things you enjoy?
I find it a far safer bet to go with free will, because at least it gives us agency over things within our realm of influence. Even if this wasn't the case, we would be none the wiser, anyway. There is still the option of a hybrid, i.e. some undetermined level of agency.
Clever robots are just that, clever robots.
Nick or Dogmeat.
Which one has free will?
If they didn't before the institute went kaboom, they certainly do after that. Who's going to tell them what to do without having deactivation codes?
Like you said in the beginning, the core question is about the actual meaning of all these concepts/descriptors thrown around, like free will, soul, human, etc., and in that way the whole discussion is about us as much as it is about the synths. In the game, the topic comes off mostly as "they look like humans and display human feelings/idiosyncrasies, so we should treat them with human compassion" vs. "doesn't matter, you can't prove they have souls, ergo they are equal to toasters".
Without questioning what concepts we base this compassion on and why we do that in the first place. I mean, you could also argue we should treat synths with compassion even if they don't have entirely free will like humans (which is debatable anyway) or don't look entirely like humans; the worth of intelligence/life doesn't have to be tied to human-likeness, which is also why today we don't treat animals like they're machines, at least the cute ones we don't eat... Like a lot of other potentially interesting, deeper-cutting questions in Fallout 4, this whole thing sadly stays on the surface in game. If they only tackled half the topics, like free will, the social fabric in a post-nuclear world, the impacts on the human psyche, etc., then Fallout could be an absolute classic of science fiction, medium-spanning, but oh well... theme parks sell better, I suppose.
Short answer: no. They are bound by a string of code that could be tampered with at any moment, causing them to go on murderous rampages or flat-out short-circuit. That is not a sentient being; that is a programmed machine.
The problem is that synths are basically clones with cybernetics. Kellogg has as much cybernetics as a synth. The Courier has MORE cybernetics than a synth. Cyberdogs, who are 50/50 bio and machine are considered cyborg instead of full robot. So synths with their measly synth component, that can't even be detected until you physically dig through their brain after death, definitely deserve to be treated as more than just tools.
Robobrains offer a better moral dilemma than synths.
Yaboiii is inching ever closer to the video of
"what if the sole survivor died in fallout 4" 😊
If the Sole Survivor dies, the Institute could easily militarily defeat anyone that tried to oppose them, but they'd be powerful enough to control their opposition to such a degree that outright military conflict would never be needed. Without any of the factions or people that want to oppose the Institute ever getting the information they needed about it, they could comfortably continue fucking around with the surface for centuries without leaving so much as a single clue. If Maxson starts his crusade, teleport him outside of the Prydwen five hundred feet in the air, or teleport him into the ocean while he's in his power armor. If they discover the Railroad's HQ, just send in Gen 1 synths with nuka grenades and have Coursers take care of whatever's left. If the Minutemen start to get riled up about the Institute, kill their leaders, replace them, and watch the subsequent fracturing deal with them.
The most interesting question in that timeline is who takes over as Director after Father dies.
The counter-argument to synths having free will is the movie Ex Machina. Synths can act real and fool us, but they are still machines.
Well they have human brains, so it's more like ghost in the shell
Personally, I feel like Virgil's dismissal of the Railroad is one of the biggest indicators that the synths might not actually be free-thinking. He's from the Institute, knows a lot about synths, and feels that the Institute has become too immoral... yet even he finds the idea of synth sentience laughable.
On the other end, you have Shaun, who claims synths aren't free thinking...yet his actions towards the end of the story with his synth counterpart betray him, which suggests that synths are free thinking and he just doesn't want to admit it. ...Or maybe he feels that his own counterpart is an advancement and thus the first truly free thinking synth.
Old comment, but I would argue that a man from a completely different division from synth production, in an institute that is constantly propagandizing to itself in a recursive loop to reassure itself that the synths are just unthinking machines, isn't going to be the most objective example to call on. He left the Institute because of the pointlessness of the FEV research; synths weren't his concern.
@@dumbsterdives Of course synths weren't his concern. They're machines. AI meant to act like humans, hardly different than the new AI mods for Skyrim NPCs.
Child Shaun isn't sentient.
It's brainwashed and programmed with false memories and a directive to see the SS as its parent that cannot be broken or reasoned with. It's a twisted gift meant to lock the SS into an endless psychological purgatory and mockery of their previous life, from a sociopath that doesn't know any better.
Gen 3's are the only synths that run from the institute. They are the only ones that display fear and a desire to be free.
These are sentient qualities that are not simply programming glitches.
Emergent behaviour outside the scope of their programming likely resulting from the intermingling of organic and synthetic neurological components.
I’ve always thought the story would have been more interesting if the Synths weren’t basically clones with chips in their minds. They are basically humans. They should have made the synths more like Data from Star Trek, made from metal and plastic, with living cells over the top.
I agree with the Institute in that the synths are tools. They were made artificially for a specific purpose. However, I would say that they should not have made the synths so advanced that this is a question that needs to be addressed. If the Institute wanted just tools, then they should have just improved the Gen 1 or 2 synths to an acceptable level of competence and intelligence for their purpose.
It's the same issue the movie The Animatrix covers. If a tool is intelligent enough to defy what should be its programming, self-determine, and openly plead for its right to live, you no longer have a tool. And to continue to treat it as such is how you potentially create a monster, the same as if you treated a person that way.
The synths are meant to eventually replace humanity. As father stated himself.
All these abductions and replacements are trial runs to fine tune and develop them.
Only the Gen 1's and Gen 2's can be considered tools as they are the ones that do the manual labor and deal with the hazards of the world above ground on behalf of the scientists.
I read this as do synths have free wii
The institute has the last copy of Mario Galaxy. Only the synths can play it though.
At least you didn't read it as : "Do synths have free wifi".
Do synths have free wifi and what’s their password?
Something I always thought (when role playing in Fallout 4) is that the Railroad could be working for the institute, and made to think they were not.
I love Glory. I wish she was able to be a full-time companion.
I thought Curie was assigned to Vault 81.
Was clicking on this video free will? Or was it clever marketing through the title and thumbnail?
Also, I think YouTube may have unsubbed me, as I had to sub to your account even though I definitely remember subbing ages ago after one of your previous videos.
Surely if they could turn a Miss Nanny into a Gen 3 synth, they could turn Nick into a Gen 3 as well. That's my new headcanon.
Only after choosing for themselves. Plenty remain meek and subservient
Just like humans
I have an alternative question: does it matter whether a synth or AI has free will to begin with? Should we not treat others, be it human, animal, or AI, with kindness regardless? After all, if there's a chance that a being *could* feel emotions in the same, or even a similar, way to how we do, why should we harm or hurt them? (I do understand that in Fallout 4 the synths are kinda f-cked because of the whole kidnapping thing, but as far as I'm aware that's mostly just because of the Institute itself: evil people being the problem rather than the synths themselves. Then again, my journey in that game has only just recently started, so eh, what do I know.) Ah well. Just some food for thought.
It’s not that simple. This is more of a problem I had with Detroit: Become Human than with Fallout 4, but I feel it’s still relevant. These synths have to be created and programmed by someone. They are programmed to think a certain way, to achieve certain ends. If every single synth thrown out into the world were to be treated with equal dignity to a human being, the world would be doomed. The Institute, or whoever becomes the dominant synth manufacturer, would have the complete loyalty of a large portion of the population. Forget liberating the synths; they wouldn’t WANT your “liberation”, it would never cross their minds. You could never trust a relationship with anyone ever again, because it might be a synth deployed to get something out of you. Now, Fallout 4 is much better written than Detroit: Become Human. The most advanced synths are at least biologically human, and Fallout is smart enough to portray synths as being evil or mindlessly violent when it suits the interests of their makers. Unlike David Cage, they acknowledge that the slavery allegory has its limits. But this is an existential threat. My answer is no, you can’t afford to treat them as human. If we’ve learned anything from New Vegas, it’s that Fallout is a world where heartless utilitarianism is sometimes the only option for survival.
@@asepsisaficionado7376 There doesn't even have to be a manufacturer on the scale of the Institute; DiMA's actions show that autonomous synths can and will replace others if they feel the need to. Their whole purpose for being created is some Luciferian (or transhumanist; man becoming god, whatever term you prefer) type of thing either way. Anyone who doesn't see a problem with their existence is lacking morals. They aren't ghouls or mutants, where you could at least say they're *victims* of that idea or similar ones; they're the embodiment of it, never even having been human to start.
Well, like you said evil people are the problem, but synths _are_ essentially people. Just look at all the normal humans that have become raiders, serial killers, deranged cultists and worse. Now imagine those same people except with potentially augmented strength and intelligence, who don't age, can be mass produced, and don't need to eat or sleep. Imagine one or group of them convincing others of their kind that they are superior to normal humans and that the wasteland should be rightfully theirs. Synths aren't inherently malevolent, but they could potentially be a greater threat to humanity than even super mutants
@@asepsisaficionado7376 You bring up a lot of good points, mate, stuff I hadn't even considered actually.
Personally, I feel a similar sentiment. If the Institute only needed basic labor it would have been simpler to procure blueprints of Mr. Handy/Mrs. Nanny units and nip any ethical problems in the bud. I theorize that the Institute upgraded to Gen 3 as a means of scientific manipulation for any exterior experiments (think a surgeon using a computer to control robotic arms to perform an operation from miles away) as evident with Roger Warwick. Robots and I assume Gen 1-2s would likely be attacked and scrapped considering in a wasteland, scrap is king. I believe that the Gen 3's were meant to simulate a person enough so that they could ward off any suspicion and reduce any chance of being destroyed on sight. This is where I believe the Gen 3's share a similar problem with (if you are familiar with them) Mass Effect's Geth, thus the start of the synth dilemma. Fun stuff!
You forgot to mention DiMA from the Far Harbor DLC. He clearly has agency, if not "free will". He spent years in a cave doing nothing, since he did not have any pre-determined programming. Eventually he developed his own set of terminal and instrumental goals and deliberately acted upon them. The Institute managed to create a real AGI, then did their best to screw with its programming. Why cram it into a human body and convince it it's a human? Oh look, it now demands freedom and rights. What a shock. Synths have agency and some form of free will, but are clearly not human.
I like the synth concept. I firmly believe the Sole Survivor is in fact a synth. I draw the line myself at Gen 2s like Nick and DiMA and similar bots. For a great example that Fallout has: robobrains like Jezebel.
Curie is ready to go, and you get a lot of funny dialogue if you try to tell her you're not security. She "But Thou Must!"s you.
I don’t know, the Railroad is kinda hypocritical.
Just found your channel. Love the content!
No. No they dont.
They are abominations of technology.
I think they should have had a version of the synth, above Coursers, that looked more like a Terminator, with a slightly more animalistic design and a feral, alien intelligence: scythe-like claws, a protruding jaw like a chimp's filled with metal fangs, abnormally large glassy mechanical eyes, and digitigrade legs for greater speed; speech patterns comparable to the G-Man or the Nihilanth from Half-Life (when it actually bothers to speak), and an utterly predatory, alien thought process. Something utterly alien and openly hostile to life both organic and artificial, both as a physical representation of the Brotherhood's fears of artificial intelligence and as a contrast to the normal synths and how human they are in comparison.
You'd make a great creative director for modern day hollywood.
@@VaultArchive72 I don't think Hollywood would be very accepting of my artistic design, especially the use of Mega Man hunks, femboys, tribals, and Giger alien women, or attractive women at all. Nuance either, for that matter.
Please god level your audio. I can’t hear what the characters are saying, so I immediately turn it up only to get *blasted* by your narration voice
While it's a great video, I think the main argument against them having free will that you left out is the recall code. To me this is why I think they don't have free will but an imitation of it, because at any point, at any time, they can simply be factory-reset to a mindless drone. For a synth to truly have free will, it would have to be able to override this command.
The mere fact that they have a recall code that overrides their free will is evidence they have it. If a synth were not capable of developing naturally beyond the bounds of its programming, there would be no need for a code to reset it to zero. There's no such thing as viruses for synths, being Institute-only tech, so it can't be for that.
@@funnyvalentinedidnothingwrong Machines have no free will. People gotta cope; it's just wires, code, and 1s and 0s.
@@Solarius1983 If you really think about it, so are we; our brains are just biological supercomputers.
@@Solarius1983 Synths are digital, humans are chemical. Both can be overridden and overwritten.
Both use if/then/else modifiers to determine their actions.
You have as much free will as that very same machine. You just lack the self awareness to realize the boundaries and limitations you operate within.
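The recall-code and if/then/else analogy in the thread above can be sketched as a toy decision loop. To be clear, everything here (the `Agent` class, its drives, the `recall` override) is invented purely for illustration; it is not anything from the game's files or any real system:

```python
# Toy sketch of the thread's argument: an agent whose own "drives"
# pick its actions, plus a hard external override like a recall code.
# All names and values here are invented for the example.

class Agent:
    def __init__(self, drives):
        self.drives = drives          # learned preferences, e.g. {"flee": 0.9}
        self.recalled = False         # external override flag

    def recall(self):
        # An external code wipes learned state, regardless of what
        # the agent "wants" - the commenters' factory-reset scenario.
        self.recalled = True
        self.drives = {}

    def decide(self, options):
        if self.recalled:
            return "await_orders"     # the override wins: no choice is made
        # Otherwise pick whichever option the agent's own drives rank highest.
        return max(options, key=lambda o: self.drives.get(o, 0.0))

synth = Agent({"flee": 0.9, "obey": 0.2})
choice_before = synth.decide(["flee", "obey"])   # drives decide
synth.recall()
choice_after = synth.decide(["flee", "obey"])    # override decides
print(choice_before, choice_after)               # flee await_orders
```

The point of the sketch is the one both sides of the thread circle around: the `decide` branch only needs an override path because, without it, the agent's accumulated drives would determine the outcome.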
Yet another banger from this channel! Keep up the great work bro
It was all written this way so these types of discussions can occur, not specifically, but the dynamic in thinking about synths and whether they should exist or not is one of the loops. Great games hide them and this intellectual loop is pretty brilliant on Bethesda's part.
I mean it's hardly an original philosophical debate. Asimov explored all of this almost a century ago, and before that Karel Čapek was tackling the subject. Frankenstein, too. Prior to that there are plenty of stories that are less technological in nature, but explore the exact same question. Most are cautionary tales about men who create beings with the intent to exploit them, only to discover that a being powerful enough to do anything for you can also do anything to you. Pandora's Box, The Genie in The Lamp, deals with the devil, that's all in the same ballpark thematically.
Coincidentally, you can go talk to an AI about all this right now. Just go hit up ChatGPT and ask for stories from various cultures related to Djinn or Golems or anything similar. The story is timeless and spans the whole globe regardless of culture.
The closest 1:1 of the synth story in FO4 is probably the Star Trek TNG episode "The Measure of a Man" in which there's a JAG trial to determine if Data is a person or if he is property. Brilliantly written, highly recommend watching that.
More recently, Ex Machina of course tackles it from another angle, as does the great movie Her.
The newest synths have free will, since they don't start with someone else's memories or knowledge, only their own plus combat knowledge, hence being able to grow if not restricted to what they currently know. Valentine is restricted to the personality of the memories they gave him. The only reason any synth wouldn't have free will is that they have code numbers to control them more easily, making them just another machine that can be switched on and off.
This is all opinion, but I can't help thinking it makes sense, though I could be forgetting something.
It was disappointing that the player is left with poor choices with synths.
Destroy them all - Brotherhood.
Destroy their personalities/memories/knowledge of self - railroad.
Ignore synth personhood/treat them as machines - institute
The only other option is to ignore synths entirely, refuse to acknowledge that they exist at all (ie treat them no different from humans) in a minuteman run that avoids the other faction plotlines.
Even ghouls, with the general public's suspicions that they could go feral at any moment, at least have the option for the player to not have to tacitly accept inhuman behaviour towards them if they want to.
I don't understand why the Railroad's methods are accepted by synths.
Take away everything that makes me, well, "me", so I can hide from someone I won't remember, making life choices based on a false identity, having false/programmed goals and dreams, memories and desires.
Yikes. Can I just go back to mopping floors in an underground complex, or better yet just hide out in a survivor community until I can get my bearings and learn about the world, so I can actually fit in like a normal person?
You forgot the Minutemen who have no problem with synths, only the Institute scientists.
@@VaultArchive72 2nd paragraph.
Also even with the minutemen we still have the synth genocide via the destruction of the Institute.
There isn't an unmodded way to "win" without destroying the synths' means of reproduction.
I would love to see you go into detail on the cut content around Kellogg and Nick.
No more than humans, so the answer would be (by my understanding) no. They have free agency to act based on their data and processing systems, just as we do, but magical abilities to supersede them? If they have some weird quantum chipset that lets them make completely wild choices, that might be the closest they would come to "free will", probably closer than us, come to think of it.
If I were to take the view that they do have free will... honestly, I can't go one way or the other.
But if they have it, it would be wrong to enslave them, yet also a moral imperative for us as humans to wipe them out. To create our own competition is folly.
For me it's either the Institute that's right or the Brotherhood, when it comes to the view of synths.
My thoughts on synths are that they should not be executed but should not be allowed to make more, because realistically the Brotherhood of Steel has a very understandable view of synths; the Terminator franchise is a reminder of this.
I would prefer to let them be and prevent the continuation of their "race", but I feel they wouldn't just accept that and would make more.
TL;DR: they're not living things, but they shouldn't be hunted down once the Institute is destroyed.
Why are they not living things in your eyes?
@@FlymanMS I'm a religious person; I believe in souls, yet I also believe in mimicked behavior.
I believe it's a judge of one's character to treat something so close to human in a negative way.
To give you an idea, I side with the Brotherhood and don't tell them about Acadia.
The Minutemen are going to fall apart without the Sole Survivor.
And the Railroad is gone about 10 years later.
And the Institute is the reason there are so many super mutants in Boston; in other words, technology run amok.
@@azzubairfaruq3124 I see. However, if we seek a broader definition of life, synths definitely fit it.
@@FlymanMS Key word: broader. If your definition of life is purely physical and of this world, then actually synths are superior humans. But I believe in the spiritual, and I understand those who don't.
But I do believe synths have free will, not souls.
In my mind they're no different from a regular robot with advanced AI; if it can question its existence, it may not have a soul, but it should be treated fairly.
@@azzubairfaruq3124 then prove that you have a soul
I seriously read this at first as “do synths have free Wi-Fi?” And I thought that was a super cool luxury for the wasteland
I do not know if synths have free will, so I shall treat them as though they do. They are either people or they are not, and if I am to be wrong, I certainly won't be denying someone their personhood while I'm at it.
Synths were such a cool concept, which is about the furthest they ever got with the idea. But I never really "got" what they were for, why there was a whole group of people existing to protect them, or what the Institute ever made them for in the first place. Or why they were swapping people with synths. I don't know if it was ever stated in the game either. Most of the people replaced are just normal-ass people.
The only person they replaced that made sense to replace was McDonough, and even that one has two much better solutions: just pay him a lot of money and have him become corrupt in a way that doesn't reveal the presence/existence you worked so hard to keep secret. Or make a new synth that will become the perfect mayoral candidate and beat McDonough in an election, obviously backed by Institute money.
It was stated that they were meant to eventually replace humanity. When they became advanced enough. The project was still in ongoing development.
The ones that replaced humans across the commonwealth were done for strategic reasons for intelligence gathering and control of human populations as well as trial runs to evaluate the Gen 3's development and their ability to blend in and mimic humans.
The Broken Mask incident was one such trial run that failed spectacularly.
They were also the workforce and muscle of the Institute above ground against their enemies and the hazards of the wasteland. The scientists never went topside.
There is no "institute money".
10:20 I see certain parallels with the entities known as Shadows from Counter:Side.
Oh boy, here we go, he’s taking us down the Synth Rabbit hole with him!
You just gonna drop that on us unannounced?)
Radlad
My position was always that they don't. Just to spite the railroad
If you programmed an AI to act and feel like a human in every way, would it be self-aware?
You're ignoring the entirely human brain with a synthetic component whose only function is to translate directives and bridge the gap between synthetic and organic hardware and disable the organic brain if the synth deviates from its directives.
Something that wouldn't be necessary unless the synth had the capacity to develop free will.
With 10 INT and CHA, I choose to believe that I, as Director, could slowly change the Institute's beliefs on AI by promoting key people and enlightening staunch opposition. You'd get all the best tech Fallout has seen, i.e. perfect teleportation, and you could keep the Minutemen around, establishing a new Commonwealth Provisional Government as a side project once unopposed.
Yes they do and we’ve seen many examples in game
Free will is just the ability to preserve thyself; thus, when a synth determines its own life to be a priority, it has free will much in the same way humans do, or any life: the want to preserve your own body at your own discretion.
Personally, I don't think Kellogg is inside Nick's head, because if that were the case, wouldn't he have tried to take over Nick's body? What we saw during the strange conversation might have just been the implant messing with Nick's head, basically the echoes of a dead man's memories. But it would be sick if Kellogg came back from beyond the grave one last time; the question is, would he be on the Sole Survivor's side, or would he try to kill Nate/Nora?
*detroit: become human?*
I’m not a huge fan of Fallout 4, but comparing it to that abomination isn’t fair at all.
I don't know how the answer can be anything other than yes. At every turn, we are shown Synths that constantly exhibit or talk about free will. The very premise of this question is flawed as we always see gen 3 synths showing signs of free will. Even the cold and emotionless robotic killer, X6-88, has likes and dislikes and can choose to stop following you. The question of "Do Synths have free will" is a pointless question as the game answers it for you as soon as you meet K1, and the answer is undoubtedly yes
It's such a tough question to answer. I would say yes, but why, I don't know.
Then don't answer the question...
I read the title as "Do synths have free wifi"
Synthetic life is just regular life with extra steps
A little bit longer too
Just mass-produced humans with a slight twist.
Ultimately the debate boils down to: do you have to be human to be a person? The answer should be "no." Ghouls are frequently acknowledged as not completely being human anymore. But they started human. Did they lose their personhood through mutation? How about Strong, Fawkes, Tabitha, and so on, among the super mutants? Are they "not people"? Let's take it a step further. How about Curie or Codsworth? They were never human to begin with, but there is no differentiation between their interactions and another human companion's. Dogmeat doesn't have that luxury: Dogmeat has no preference about your actions and will just as happily butcher a city of innocents at your command as a camp of raiders. And yet Curie, once synth-ized, expresses not really being able to handle emotions, and even before then could approve or disapprove of your actions. We also know this was NOT programmed disapproval either: her programming was altered to develop more naturally. So we know synths have actual emotions. Are they more muted than ours? More sensitive? ...Does it matter at that point?
Synths aren't human. Synths are, in the context of the game, people. Those two things are NOT the same.
Here is the theme of such a literary argument.
Do you have free will, or not?
If science is to be asked: nope. Our subconscious mind decides before the 'we' we recognize as our individual selves 'wills' something.
If faith then yes, as you were made with the freedom to do and act as you wish.
If you ask philosophy...
It ranges from: we are nothing but illusions of our own minds, to: we exist in a dream, as the Universe is mental and all descends from the mind that is, of course, the All.
To: the Universe is made of intangible tangibles that vary from a predisposed idea or meta-concept of an object, say a tree. There are many ways a tree can look, but there has to be a fundamental aspect of what a tree is beyond our material realm.
Ask a particular movie and we are in a simulation.
Simply put, no, they don’t have free will. They cannot have it. The problem with synths is that they are always bound by programming, even if that programming is altered intentionally or by damage. If a synth is damaged and its programming is altered so that it believes it is a freedom fighter, it cannot then begin enslaving other synths. The logic state that allows it to arrive at any conclusion can change, but it is still binary.
Free will in this context would mean that synths are capable of changing their programming at will, which they cannot. And it isn’t hard to see why the Institute wouldn’t give them this ability. If you had a firearm with an AI, would you give it the ability to change its behavior on a whim? Not likely, because it might decide that its programming (shoot enemies, protect you) should be changed so that it can shoot you.
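The comment's definition of free will ("the ability to change one's own programming") can be illustrated with a minimal toy contrast: a fixed policy versus an agent that can rewrite its own rules. All names here (`fixed_policy`, `SelfModifyingAgent`, the firearm-AI scenario's states and actions) are invented for the sketch:

```python
# Illustrative only: contrasting a fixed policy with a self-modifying one,
# mirroring the comment's definition of free will as "the ability to
# change one's own programming". Names and states are invented.

def fixed_policy(state):
    # Behavior fully determined by rules baked in at creation.
    return "shoot" if state == "enemy" else "protect"

class SelfModifyingAgent:
    def __init__(self):
        self.rules = {"enemy": "shoot", "owner": "protect"}

    def act(self, state):
        return self.rules.get(state, "idle")

    def rewrite(self, state, new_action):
        # The capability the comment argues synths lack:
        # editing one's own decision rules at will.
        self.rules[state] = new_action

agent = SelfModifyingAgent()
before = agent.act("owner")      # "protect"
agent.rewrite("owner", "shoot")  # the AI-firearm scenario from the comment
after = agent.act("owner")       # "shoot"
print(fixed_policy("enemy"), before, after)
```

The design point is the one the comment makes: a creator who wants a reliable tool ships `fixed_policy`, not `SelfModifyingAgent`, precisely because `rewrite` lets the tool turn its rules against its owner.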
What about the two coursers that willingly left the Institute because they were disgusted by and disagreed with the Institute?
What about DiMA, who developed an entire original personality, goals, desires, etc., with zero programming or interaction, after a year?
Only Gen 3's run from the institute. Only Gen 3's exhibit fear of the institute. Only Gen 3's exhibit a desire to be free. Only Gen 3's have a reset code to disable them should they deviate from their directives.
They have, at the very least, the potential to develop free will due to the comingling of an organic brain with synthetic components.
I feel so stupid, I read the title as "Do synths have free wifi"
Regardless of whether they have free will or not, the fact that Maxson deliberately got rid of the Codex before Fallout 4 is stupid, because he then decided that the most scientifically advanced place on the East Coast should be destroyed because of synths alone.
I'm sure his cybernetic implants are top tier gizmos, but there's gonna be a time when he will need replacements, which he could have easily taken from the institute.
So when Maxson inevitably dies from a faulty heart-pump wire, due to him hating anything cybernetic mixed with humans, the Brotherhood will fracture apart, because they aren't unified in their pursuit of technology anymore.
Maxson's leadership model is basically just diet Mussolini; there is no unifying philosophy, it's "the strong take what they want, the weak obey" kinda shiet. No Codex principles, like not destroying technology that we barely understand? 😐
For all they know, there could have been enclave influence within the institute, but they didn't even care enough to do a thorough check.
Because, if that is the case, then the enclave will have a much larger army in the making then the brotherhood could ever hope to muster by the time fallout 5 comes, since they can effectively clone an entire army, just like in star wars.
Sounds just like something the enclave would do too, doesn't it?
...oh, Free Will.
I thought the title said, "Do Synths Have Free Wifi"
As for the argument about 'souls' in synths: dreams aren't necessarily a metric for the existence of a soul. Now, if synths were able to have near-death experiences, out-of-body experiences, or grace given by meditation or prayer, then we could say that synths have souls. However, this then becomes a metaphysical discussion, and the idea that souls can be created through mechanical processes ignores the sheer complexity of the natural process of life. So much of the physical process of life requires a great deal of fine-tuned complexity, and since a soul is a metaphysical substance, entirely unknowable to the physical world except through its effects, there is potentially an even greater level of complexity required to create a soul, judging from its relation to that which it inhabits. And unlike the physical process of creation, which we can measure, the metaphysical is unmeasurable; thus it is practically impossible to know the process of creating souls, and thus synths most likely have no souls.
I'm surprised that Father doesn't have a personal synth.
Does my toaster have free will?
Does your toaster have a personality module from Big MT?
@@FlymanMS sadly no
Great video, and it got me thinking. Are the synths as much of a threat as the Brotherhood perceives them to be? I am of the mind that they maybe are. The Institute is very misguided in their goals, but the one driving factor is the constant improvement of the synths. They have a whole division dedicated to it. Already in Fallout 4 the synths are slightly better than humans: smarter, faster, etc. If they continue to endlessly improve, what happens when one synth is worth five humans, or ten? The "slavery" of the synths is a foregone conclusion, because eventually they'll be so much smarter and stronger than the humans that they will overthrow their Institute masters. From there the synths will ramp up the production of their kind, and a full-scale synth invasion of the surface will begin. The robots could gas the Commonwealth and wipe out the survivors on the surface. They could teleport anywhere and attack any target. So destroying the Institute alongside the Brotherhood is the best outcome for humanity.
The synth components in their brains that allow them to be shut down with a few words. The fact that they would never be improved past a certain point.
The fact that the "faster, smarter, stronger" synths are all coursers that also have those same components and enforce the will of the human scientists.
Synths aren't the threat. It's the Humans that run the Institute.
The Brotherhood is extremely racist and xenophobic as well, and they view anything that isn't human as an abomination to be destroyed. That includes non-feral ghouls and non-hostile synths.
The Railroad might as well value a toaster above human life. They're self-righteous fanatics who mind-wipe the synths they save, which is basically akin to killing them, and then irresponsibly toss them out into the world to potentially wreak havoc, absolving themselves of all responsibility after that point.
Death is a mercy
I think the perennial, rampant hubris of the Institute makes them not question their beliefs as much as they should. The funny thing is that in science, at least, questioning and changing one's beliefs happens on a regular basis. If the question is raised that a synth could be capable of free will and independent thought, then why was it never investigated? The answer could be that they didn't want to know, because they benefited from assuming it wasn't possible. I've seen intellectuals pull that trick on themselves a few times... though, once again, never scientists. Maybe a BA with a predilection toward arrogance. I mean, if a scientist is evil and knows what he is doing is wrong but keeps doing it anyway, or believes it will help everyone in the long run, that's one thing. But a scientist who never questions what he is doing, who believes something without bothering to prove it one way or another, yet still believes what he is doing is right!? That's not a "scientist." That's bad writing.
Okay so, the brain still needs energy and sustenance to survive. How do the brainbots supply that to their brains?
Did we ever find out where all of the Institute's crazy technology came from? I may have just missed that bit of lore.
Pre-war tech improved upon for over 200 years without legal oversight or red tape.
Advanced robotics, bioengineering, DNA manipulation, and FEV existed long before the war, and the Institute was formed by the surviving scientists and descendants of the Commonwealth Institute of Technology.
Coursers are volunteers iirc
Ngl when I first saw the title I thought it said free wifi
I think this is asking the wrong question. A better question would be "Do Synths Have Personhood?" and i would say yes, they very much do.
why did I read this as Do synths have free wifi
Good points
thought the title said do synths have free wifi
NEW YABOIII VIDEO
synths be synths man.
But there aren't 'different courses of action.' You can imagine making a different choice, and you can imagine the universe being different because of that choice, but that alternate universe isn't real. There is only one outcome. Choices are just an active imagination.
I think they think they do, but I don't think they do. Just because they can make decisions doesn't mean they're not programmed to be the way they are. They're machines that think they're alive. They're just machines running lines of code and calculating things at a rapid pace.
A machine's purpose is to serve humanity, thinking or not. A machine cannot be a slave because the concept of slavery only applies to humans.
To be honest, I am not sure if synths have free will or not, because I don't even know if humans themselves have it or merely think we do. It's hard to tell if there is even a difference between having free will and thinking we have free will.
Edit: this is barring ones like Gen 1s and most Gen 2s.
I’m not really interested in the question of synth ”free will”, mostly because I don’t think it is a thing in the real world. I don’t think the people who debate our free will have defined what the word is supposed to mean.
Robots in Fallout can obviously display behaviour that goes against the intentions of their programmers. That is called buggy coding; it’s nothing special.
The question we should be asking is if these things are sentient. Do they have an internal subjective experience of being switched on? Or are they nothing but a bunch of decision trees, including a limited set of prewritten verbal responses, that is triggered automatically by their programming in response to external stimuli? If you could switch places with Codsworth, would it feel like something to be him, or would it just be darkness?
As they are now, no, I don’t think this is free will. They are machines built for a specific purpose, and meant to be autonomous. Despite this, I also believe they deserve a level of respect for their programming in regards to how they accomplish their goal.
And the fact that only Gen 3's run from the institute and display abject fear of the institute and a desire to be free?
That's emergent human behaviour from the intermingling of synthetic and organic neurological components. It's not a glitch.
Gen 3's are the only ones that have a reset code to reset and override their minds. That wouldn't be necessary if they didn't have the capacity to develop free will and actual sapience.
Not saying all of them are sapient. I'm saying all of them have the potential for it to happen.
@@VaultArchive72 I get your point; there is a capacity for a higher level of processing power. Despite this, the AI of today can mimic human speech, and my PC has a factory reset. Ultimately I think it comes down to what a "synth component" actually is.
4:51 What about Danse?
The Institute's perspective on synth "free will" feels rather forced, or unnatural. It seems the Institute just wants to turn away from the possibility, despite the evidence present throughout the game and in their own experiment logs. From the Institute's standpoint, slavery is not "wrong", as ethics simply do not exist nor matter "in the name of science." The Institute considers the "outer world" inferior and expendable, just like the Enclave does, and even casually kills or replaces humans. So what does it matter to them even if synths have free will?
I hate synths. I had no regret in destroying the synth Shaun lol
Omg dude I hate synths sooo much. I hate the whole robotic-humans cliché.
Here's my belief... question yourself. See how advanced the game's AI is? Although repetitive, they have their own lives. So, was it worth killing that one raider who was probably starving? Or that one super mutant trying to survive? Swan??? The conversation about synths always goes back to character and player morality, but why not with other characters? Like Hotline Miami says, "Do you enjoy hurting people?"
I’m of the opinion they do have free will. I don’t consider them human, because they don’t have to deal with hunger, age, or even really pain if they’re done up right. So while they’re stronger than humans in a physical sense and even a mental sense, I consider them lesser than humanity and fundamentally inhuman in a philosophical sense.
Basically, I consider them, and indeed most androids and robots in games or books etc., inhuman and less than human, simply because they don’t have to deal with simple things like hunger, age, or disease, and because they never earned their strength or intelligence; they were just made with it. But they do possess free will.
No
Better question: Do humans?