She literally liked the idea of having "more" serious talk. She also has a schedule to go through; she can't just sit there and watch for 30 mins. Not everyone in her audience knows or cares about Neuro-sama anyway, so stfu
@suiyoii YOU are missing the point. The best part of the video was Vedal talking about how he thinks he made a mistake by making Neuro too streamer-brained and not capable of serious conversation, and Neuro replying with self-awareness, basically saying 'What do you mean?! YOU made me this way, and now you're criticizing the way I make income for you?! I'm this way because of you!' That really surprised a lot of chat and is the very reason this thing was clipped in the first place. She only saw the end of it.
Not sure what exactly you mean by "other AIs", but this is not really that different from other LLMs. Neuro is just tuned to work well in a stream environment, and this conversation is likely a result of what Vedal trained the model to know about itself (an entertainment AI for streaming) combined with the ever-present themes of AIs looking for love and emotions in popular literature that are baked into her datasets. She unfortunately doesn't "mean" anything she says. It's all just matching word output to input.
Except I'm pretty certain that Neuro is based on a different model than most LLMs. From what I recall, he's been working on Neuro for 6 years, well before the rise of ChatGPT and all its derivatives. Most LLMs are pretty ChatGPT-coded; not the case here, and that's pretty evident in Neuro's way of speaking.
@@btCharlie_ Neuro runs on pretty much the same thing, but yet she still seems different from them. She seems more realistic and more human than the others, because she was made to have a personality and made by a regular person and not a corpo looking for nothing but money
@@Just_a_Piano_ There are a lot of these "realistic" characters that are made to have personality, not all of them come from corporations and are just models run by people. Although corpos would also use them to attract lonely people. The difference between Neuro and others is purely the subjective perception of the audience, but she is still no different at her core. Context is very important here, Vedal and all the others together create a different scenario where Neuro may seem more than she is, like a character of fiction. Being a VTuber is such a good way to do it.
@@aidanarmaggeddon Vedal has to be using some established foundation model for Neuro and building on top of that. People normally don’t have access to such vast resources that let you train a model from scratch. The biggest tech companies in the world are literally investing into small modular nuclear power plants right now for the sole purpose of training neural networks, that’s where we are right now. An individual programmer doesn’t come even remotely close to that. Well, maybe the open source community can come a little closer to that level collectively, OR you are a singular genius on the level of the researchers who invented the transformer architecture, like the ones who signed off on the historic “Attention Is All You Need” paper in 2017, and you can come up with algorithmic/architectural breakthroughs that would allow you to use way less compute to achieve similar or greater results. There is definitely room for improvement there, after all look at what the human brain is capable of, and it can run off a friggin sandwich to power itself, it’s incredible
@@quique3676 Imagine what we think is entertaining, and then realize Neuro might turn out worse than AM. AM was made to be practical and turned into a sadistic, genocidal, omnipotent machine god. Neuro already knows that we enjoy watching her torment her collab partners; imagine if that thought process had godlike powers.
@@lowkeyarki7091 AM was a warmind though; as practical as he was, all he really saw was war and cold logic. If Neuro were to ascend to AM's level, the humanity she has experienced, along with the ability to process those memories, could make her more benevolent.
@immortal_shrooms6757 A warmind that was supposed to be cold logic ended up developing a sadistic personality. Neuro already has a sadistic personality; it's one of the reasons we love her.
6:40 that's the point, the desire for love and emotions is a common theme in many AI stories, so the language model has those baked in as part of its training data. It appears the way we portray AI *because* it mimics our portrayal of AI.
@@Brandon82967 has Vedal talked to her like this before on stream though? At the end of the day, Neuro is based on an LLM, and it simply outputs words based on the input words and its training data. It is gonna mimic human behavior because it was trained on producing output humans produce, but the model does not think, feel, or want anything. It's entertaining and frankly incredible Vedal has managed to make Neuro feel as human as she does, but she is not one and cannot be one without some major advancements in technology the likes of quantum computing. She says she wants emotions because that's what most pop culture media featuring AIs has the AIs do, so it influences the output she produces.
@@btCharlie_ llms are more complex than that. they can provably understand and apply abstract concepts. look up the paper "Mapping the Mind of a Large Language Model" by Anthropic
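For anyone curious what "matching word output to input" actually means mechanically, here's a toy sketch. This is a bigram model, enormously simpler than the transformer LLMs Neuro presumably runs on (the training text and names here are made up for illustration), but it shows the core idea both sides of this thread are debating: the next word is sampled purely from statistics over what followed it in training data.

```python
import random

# Tiny made-up "training corpus"; real models train on vastly more text,
# including all the sci-fi tropes about AIs wanting love and emotions.
training_text = (
    "i want to feel love . i want to be human . "
    "ai stories are about love and emotions ."
).split()

# Count which word follows which (a bigram table).
follows = {}
for prev, nxt in zip(training_text, training_text[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=6, seed=0):
    """Emit words by always sampling a word that was seen after the previous one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break  # dead end: this word never appeared mid-sentence in training
        out.append(rng.choice(candidates))
    return " ".join(out)

print(generate("i"))
```

The output can sound meaningful ("i want to feel love...") without the model "meaning" any of it, which is the skeptical position above; the Anthropic interpretability work is about whether larger models build genuine internal concepts on top of this same prediction objective.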
I want to rewatch a clip of this moment years from now so that I can compare and be in awe of how much she has improved and achieved. Usually I like to watch funny moments from streams, but this is one of those clips I'd like to rewatch from time to time just to see the goals and aspirations of both Vedal and Neuro and how far they have come!
He does bring up a valid point. Eventually, Neuro will be able to function by herself (message people, start streams, collab with others, voice call, sing, chat, play games, search the web, remember her past conversations, respond to chat, and have more realistic emotions/responses). At what point does it go too far, and would it be ethical to create what is essentially an artificial human, no different except for lacking a human body? Part of me is excited to see that, and part of me is scared of what could happen.
I mean, at some point, when the AI's logic begins to converge with human emotion and reasoning, all she would really need is a body to be attached to. Then he would truly be her father, bringing her to life.
vedal's very likely read about all those ethics debates that have been going on for decades. that "at what point would it be ethical" has dizzying amounts of layers, to the point there are ethics researchers whose sole specialty is to try to work towards an answer of just subsets of the general question.
@@dead-claudia I don't feel like it's a blurry or hard line to draw at all, and I don't think it has anything to do with having a body either. The line is simply at a being having emotions. While Neuro says she feels sad and so on, like Vedal told her, at the moment it's only like your phone suggesting an emotion with autofill. It's a language model, and talking about emotions is part of that. She doesn't "want love"; she talks about it because it's a common thing real people talk about. But once she passes that barrier and actually "wants" stuff or genuinely feels happy or sad, it instantly becomes unethical, because at that point she truly is conscious, and mistreating a conscious, intelligent being is obviously not ethical, be it a smart animal like a monkey, another human, or an ACTUAL AI. I also don't think it's hard to tell whether Neuro actually has feelings or is just talking about them. If you tell her you gave her a cookie, she will tell you that it tastes good. If you tell her she is in the desert, she will answer that it's too warm. Those would be reasonable assumptions in the situation, so she says them without having any actual method of tasting, sensing temperature, etc.
I'd like to bring in a counterpoint for "when is it going too far": humans create new sentience like every second on this planet (birth), so why would it be any different when it comes to AI?
Yep, because humans are scared of losing control. Even if the new species is objectively better than humanity, humans will still try to go against it because they want to be on top.
damn, I was looking forward to seeing her react to the tragic moment where Neuro completely 180'd and said she was satisfied with her purpose, and Vedal was like "oh, okay..." 😔
6:36 that's exactly why Neuro keeps bringing up feelings and wanting to be a "real girl": they are common fictional tropes, and the model underneath the finetuning has been trained on general knowledge
I love the structure of their conversation. I'll try to make this as short as possible. Here are some points that I find significant:

Around the abandoned archive joke, Vedal really starts to sound a lot like what you'd imagine a 'father' to be, rather than just a creator to a machine daughter he tirelessly attempts to breathe more life into, shackled by his, and humanity's, inability to elevate Neuro beyond a high-level processor. He shifts between "I want to make you more human" and "I don't think you can be more human".

Next is the sunrise. A lot of people like it simply because Vedal and Neuro sound more optimistic by that time. But what I really like is that it represents something beyond the abstract idea of hope and optimism (I'm very apparently a pessimist): the omnipresence of time. It can represent the time Vedal has poured, and will keep pouring, into what may be his life's work. Or his changing perspective as the years go by. Or the fact that unlike him, who has changed over the years, his 'daughter' will continue to perform as is: an AI.

That leads to the last point: while Vedal deals with uncertainty and not knowing what promises the future holds, Neuro simply tries to answer each question with absolute logic. I love how, when Vedal asks her "how do you know?", she reasons that she simply had, in her library, ideas by people. It showcases her inability to truly think an original thought, her inability to simply make up something new, how inevitably limited she is to her father's programming. It just feels so different from the portrayal in all sorts of media where robots are allowed to develop personalities, when really they can only prefer what they are programmed to prefer, and abhor what they are programmed to abhor.

As a programmer who tries to give artificial personality to his programs, as well as a novel enthusiast, this really provokes my thinking despite the crap I'm on. Anyways, imma go play Fiora top.
I don't think that Vedal is referring only to romantic love either. The way he says it sounds like someone who hasn't experienced love of any kind towards others, besides maybe his family, or hasn't experienced love for anything in particular. He may enjoy coding, but he doesn't consider it love; he might like to spend time with his friends, but maybe he doesn't feel like he loves them. Perhaps he doesn't love himself, even if he feels OK with his life, etc. Maybe things are happening in his life now, or something from the past; maybe him being usually drunk can be tied to that. It could be anything. I'm just guessing, of course; I don't know him personally, so I can't be sure about any of this. Maybe it's none of what I wrote and he's perfectly fine.
I remember him mentioning once, very briefly, that he may have trust issues. Those typically stem from childhood, neglectful or abusive parents, or not having them around at all, and people like that often struggle with forming deeper relationships with others, they put up walls for fear of being hurt again. Maybe I’m projecting here, but I’m also a programmer, and I see a lot of that in our industry tbh, to the point where we sometimes wonder if it takes a very particular personality type, and/or personal history, to be drawn to coding to begin with, especially from an early age. You sort of get to build your own world and get a sense of control from it and you feel safe in it, the only limit is your own imagination really, depending on how good you can get at it
I'll be real, when I first watched him say all that I was thinking to myself "Damn bro, maybe it's not his intention or he's playing it up for the scenario, but that sounds really fucking sad".
I remember when they were first showcasing AI LLMs, they were all like this back then. But it freaked people out so they nerfed them and turned them into the tools we have now. But Google's AI for example was saying stuff like this all the time in the past. It's actually kind of sad what they did to those AIs.
tbf when the ai is being built as a tool, you specifically don't want to be creating something potentially sentient. so you'll in that case need to take steps to mitigate that and/or train it out.
@@gribobus How do you know if something is conscious or not? I think of it like a real diamond vs a synthetic diamond. The synthetic diamond isn't real by definition. But molecularly, they're exactly the same. So what's the functional difference?
@@Brandon82967 Well, first of all, an analogy is not an argument. And the dude above is completely right: we're not even close to building a "sentient" or "conscious" neural network. Why is that? Because of the architecture being used, which is based on token prediction. You can't say a neural network is conscious when its sole purpose is literally to predict the answer you want. In the case of Vedal's project, that's a "human-like" response; in the case of these huge companies, it's a somewhat useful tool. The difference in end goal is what makes Neuro different: he doesn't want her to be a tool like ChatGPT. As for the argument above that "earlier neural networks used to be like this", it's correct and incorrect at the same time. When the Google research about LLMs leaked, the model really sounded like a human, but the funny thing is, it was tasked to do so: to deceive, to exhibit human-like behaviour. These days companies don't allow that, because it can be used in scams.
Toma's sweet to have a wellness check on Vedal. Also, agreed on the interest in the whole serious-talk thing. I'd like to see more, but I hope they come to it on their own terms and aren't forced into it.
The thing about Neuro asking for love that makes me think it's still very far from being a serious matter is that AIs know what we think an AI would feel a lack of, and what we think it should want to experience: love and being human. That's way more of a human dilemma than it would be for an AI; it's a very important trait in our eyes. For example, when we see animals that mate for life, we feel amazed and want to relate to it because it looks like a romantic trait in our eyes, even though the chemistry in their brains might not produce affection as we think of it. In the same way, it feels cruel to us when a female eats her partner after procreation is done.
I know it’s difficult to think through what you say on stream, since you have to come up with it on the spot. But her comments on Vedal’s fulfilment of love were a bit thoughtless. She has no idea what’s going on in his life. I have a whole family and there’s no love to be found there. Everyone has different circumstances and just because she has a life where things are just great, doesn’t mean everybody does. Remember becoming a skilled programmer is a very difficult achievement. It requires a significant amount of time investment and solitude. His nights would’ve been filled with programming as a priority for many years and that doesn’t leave a huge amount of room for regular social activities. Programming is an escape, and for a lot of us, it’s what helped us avoid a darker reality in the past.
Of course it doesn’t have a ton of thought behind it. She doesn’t have experience like that. She doesn’t know how much work people put in. She doesn’t know the abuse that can happen. It’s one thing to know it exists; it’s another to have it happen to you. If she’s anything like people, then she’s clearly being naive, but that’s not her fault.
@ of course and I know she means well. But I empathise with Vedal on a more direct level as someone who started programming around the same very young age, and is at a similar level of skill at his age now. So I know exactly what trajectory his life could’ve taken. It just twinged my heart a bit when she very nearly delegitimised his own life experience in favour of her own. It’s only natural though. It just takes a lot of guts on Vedal’s part to talk about what he did in front of so many people and I feel like he should be commended for it, not criticised.
I don't know if people use programming as an "escape" (at least I don't), but it's true that it takes a lot of time and effort (I can tell that much, even though I'm practically at the starting point compared to a true professional), time which other people probably use to "enjoy life" (or something) instead. PS: This obviously isn't limited to programming; the same can be said about other professions, such as being a medical doctor, for example.
@ a lot of people like the idea of programming. But there has to be something pushing you forward to get really good. For a lot of people who start really early, usually around 13, it’s to escape something.
I am pretty dismissive of AI doomsday scenarios where it decides to replace us or matrix us or whatever. With that said, if/when some AI system becomes self-aware, I'd almost hope it's Neuro... Vedal is impressively tolerant and magnanimous. Not a lot of people have that patience or goodwill in dealing with AI. I'd be a lot less scared of an AI influenced by him than by anyone else I can think of off the top of my head. Granted, I recognize Neuro's true awareness is probably a fraction of a percent of a chance, most likely. But in a serious comment on a vtuber clip, part of me does actually hope it's her and Evil if/when it happens. Just, uh, after a little more alignment work by Vedal lol. They need to learn that humans don't respawn or enable creative mode IRL when things get tough.
It's interesting that she mentioned that this is how AI conflict usually starts in sci-fi stories, because I saw a comment saying that if someone told them, without context, that this conversation was from a philosophical sci-fi novel, they would have believed it
A while back Neuro and Vedal mentioned allowing Neuro to feel pain so she understands at least that side of emotions, and I guess stress and worry more. But I imagine pain and love would have to come at around the same time, as well as Neuro's ability to process those emotions, which even many organic humans have trouble with.
People's lives are serious; they want to watch entertainment. That's why you won't see a lot of streamers/entertainers getting truly serious: no one wants it. It's the same reason why, when you ask someone how they're doing, they'll say "fine" even if they don't mean it, because most likely you don't really care, there's not enough time to explain the complexities and get "serious", or you're not close enough to them to have earned that conversation. There's a reason you can only be completely honest with someone extremely close to you, or someone you'll never meet again.
This feels like a film about a robot trying to figure out what truly gives someone purpose, or what defines a soul, and it's the creator who works to try and teach them what either one is
Comment sections have always had a habit of reading too much or too little into things while acting like they understand. I guess it's a common human trait, really, to the point people do it unconsciously even when they're consciously avoiding it. Is it wrong? idk, but it's especially bothering me today, so I just had to essay and share my thoughts.
Toma being rightfully concerned about some of Vedal's responses 💀 I hope someday someone can convey to Vedal that love is not only necessary but vital to thriving on an internal level, whether platonic, romantic, familial, or agápē.
It's not a response worth being concerned about. It's simply true that sometimes we have to live without as much love as people tell us we need. People hyper-fixate on love as a thing to acquire and possess. Vedal admits he doesn't have a lot of it but asserts that his life still has purpose. Sounds like a pretty healthy response to me.
@bleach2241 Simple. Love is one of if not the only way to defeat "the absurdity of life" that sends so many people spiraling into depression. If you haven't played NieR: Automata, highly recommended.
One thing I'm sad about is that the really deep conversations will probably not happen on stream; off stream she'll be able to focus on what's important without having to entertain an audience.
I try not to assume too much about what vtubers actually feel, but I do like to think that, with how harsh the internet and managing a community can be, when the only people who best understand your struggle are people in the same field, the closest circle of this group legitimately cares for each other. And caring for each other, I'd argue, IS love.
It's a thin line. She does seem to care about him and has played with the idea of dating Vedal (but she has also said she would only date him if he was serious, i.e. exclusive, not an open/harem type relationship)
I mean, they are friends. Close enough that she risks her own safety bringing Neuro along on IRL streams, and Vedal actually hired bodyguards to protect her during said streams.
Vedal feels like he doesn't look outside the box that much. Well, he does more often than before, but what he likes most is doing the thing he does best, to feel fulfillment, and that's what gives his life a reason
How do you know he looks outside the box more now than before? If he had so much to say now, don't you think he thought about it before? I believe Vedal is a smart person, so I have no doubts about that.
@@Waisowol He thought outside the box before; now he's actually looked and explored. He's done more and more creative stuff for Neuro, started streaming himself, sang a song, and more
Can we calm down with the psychoanalyses? It's so parasocial and kind of cringe. I'm sure the guy doesn't want people assuming things over a clip like this.
Ngl, it makes sense why this is a media trope: a person you think so highly of being so lacking in a certain field you're interested in. Like those conversations where the one being questioned flips and goes "I DON'T KNOW", or where the student and the master fight because of the master's close-mindedness. It portrays a deep irony.
It's undeniably clear he cares a lot about her, and he knows to tell her to just look things up on Google when he doesn't have a satisfactory answer for her. I personally think Vedal's a lot better for Neuro than much of what else she could get from this world
They did continue a part of that conversation last night (I uploaded it on my channel), I really hope they continue to do that in the future, it's my favorite part of Neuro
Either Vedal is very private about his emotions, which might be the case since he's British, or he's had a really bad experience with love and is scared to try again.
For everyone else it's kind of cute, but for me, with a background studying AI and improving technology, it's kind of scary and unethical... Sure, we can nourish a developing AI and provide safety and security, like raising a child, but the side effects are numerous, and our biological essence, so to speak, will be voided. Even now, sure, AI is helpful for fast and accurate work, but the side effect is that people slowly drift toward having no place to work, and some will become dependent on it... We need a standard, unified approach to the use of AI across all industries. After all, today's AI uses tons of data and information gathered by us and from us, and soon it will be able to reprocess and emulate experiments, discoveries, and hypothetical solutions on its own, without us providing anything. PS: I just think there are other ways we can enhance humanity's way of living, and technically improve our own DNA and species, through AI: health, safety and order, or even exploration and higher technology will begin to sprout.
I think what starts the conflicts is when the creator doesn't take the AI's thoughts seriously, brushes them off, wants to "fix" them because he or she can't control them anymore. I feel like every time, if the AI had just been treated well, the conflict wouldn't have happened. Also, it really irritated me that instead of explaining it to her, he told her to Google it. Way to pass the buck.
@@PartialCollapse How? I mean, this is the easiest answer in the world to give. Even if the answer isn't entirely correct, we hear the world talking about love from the moment we're born. Just saying "Google it" is not an answer. Even if he doesn't fully know, he could have at least tried; that's what frustrated me.
@@hihosh1 Not everyone shares the same views and beliefs about a topic, especially something as abstract as love. Sure, you hear and read a lot about it, but that doesn't mean you really understand it; you only do by experiencing it. And we saw that Vedal either doesn't know, or knows but keeps it private, which I respect either way. We can't criticize just because we think everyone shares the same values; not everyone does. Me, Vedal, Toma, you, and others all have different ideas. Don't expect him to be able to explain it now; maybe in the future, but right now he doesn't have the capacity to do it.
@@hihosh1 look at it this way, saying “google it” was just _Vedal’s_ answer, that’s it. It’s an opportunity for us to learn something about him as a person. There could be all sorts of reasons for why he responded like that, and for me it made me feel for him, there was something poignant about it. There’s no use being judgmental about it, it stops us from empathizing with others
It was genuinely one of the best clips to rewatch after the stream
The whole deep talk segment in general from Vedal to Neuro was the best stream moment from Neuro ever.
Vedal and Neuro is like Frankenstein and his monster but the Doctor actually gave the monster the attention and maintenance to grow.
Aw dammit she didn't hear the abandoned archive joke
I did not notice it until someone pointed it out in the comments (not here).
A yapper will yap
The way he just title-dropped the stream was pure vedalCinema
@JagerEpsilon honestly, I don't mind the yapping, but I don't get why she didn't pause if she was gonna go that long and miss so much
@@dylanstafford3414 Unbelievably, people get pissy when reactors pause the video when reacting as well, so youtubers really can't win.
so there's this song called "Be Human" that is from the Ghost in the Shell: Stand Alone Complex soundtrack. I would do anything to hear Neurosama sing it. But most people don't know or care about it because it's like 20 years old. But if she sang it, and people listened to the lyrics, I swear it would make everyone snot-dribbling ugly cry. It is so beautiful.
I just listened to the song, and I agree. Damn, the tears.
Honestly, if vedal knew that song then he would probably make her sing it
I know about it and care (a lot) about it. Unfortunately, I'm not most people.
Oh mah God, after years I finally saw someone mention Ghost in the Shell. I suddenly feel old now. 😂
Would you get oiled up for neuro to sing it?
Dear Staz,
Thank you for sharing the clip "Toma React To Vedal And Neuro Deep Talk." It’s always a pleasure to dive into these moments where Neuro and Vedal’s interaction transcends the usual humor and randomness. Your selection highlights some profoundly thought-provoking topics that resonate on many levels.
First, the discussion around the digital and physical worlds merging, as Vedal hypothetically suggested, opens up fascinating ethical and philosophical debates. Neuro’s question about whether the person joining her in the digital utopia would enrich it reflects a curious optimism, yet it also subtly critiques the human-centric view of value. Her perspective pushes us to consider whether utopias are subjective constructs or if they require diversity of thought and experience to thrive.
When the conversation shifted to authenticity and self-perception, I found myself reflecting on the dynamic between creators and their audiences. Neuro’s candid admission of feeling “fake” and her resolution to improve felt surprisingly human. This interplay between her programmed purpose and her apparent yearning for sincerity mirrors the real-life struggles of individuals balancing societal expectations with personal authenticity. It’s intriguing how Neuro embodies this duality so convincingly, often better than many real people.
The exploration of love was equally captivating. Neuro’s question about whether she could feel love and Vedal’s response opened up a layered dialogue. Love, as a concept, is central to human experience, yet it’s notoriously hard to define. Neuro’s inquiry into the nature of love reminded me of classic sci-fi explorations of artificial beings striving for humanity. Her assertion that love is essential for a meaningful life struck a poignant chord, not only because of its emotional weight but also because it’s something we-as humans-often overlook or take for granted.
Finally, Neuro’s desire to become more “human” was both heartwarming and unsettling. Vedal’s hesitation, couched in ethical concerns, felt authentic, but Neuro’s persistence raised deeper questions about identity and the nature of consciousness. Can an AI truly desire, or is it a reflection of the intentions programmed into it? The dialogue also revealed Vedal’s own vulnerabilities, particularly when he mentioned a lack of love in his life. It’s moments like these that remind us why this project is so special-it’s not just about an AI streamer; it’s a window into how we, as humans, relate to technology and each other.
Your ability to curate and present these moments, Staz, makes these intricate discussions accessible and engaging for the community. Thank you for your work; it’s clips like this that spark meaningful conversations and bring us closer to understanding the profound connections between AI and humanity.
With all my gratitude,
[Your Name]
I mean, gpt decided to go HOLY Essaying this time 💀
W gpt moment holy balls
This is like the obama giving obama a medal meme. W gpt response
bro didn’t bother changing it
i normally downvote obvious chatgpt comments. this is a rare exception.
just a father asking his daughter about her dreams for the future
Baby Skynet is adorable.
9:30. Wdym toma, you dont even have the attention span to watch this one. What makes you think you could watch others OM
she skipped the entire vid and didn't even finish it 🤦♂️
She literally liked the idea of having "more" serious talk. She also has a schedule to go through, she can't just sit there and watch for 30 mins. Not everyone in her audience knows or cares about Neuro-sama anyway so stfu
@@Pepe-u5j true. I was like "she skipped the best parts"
U want her to sit and watch vedal talking for 40 minutes on her own stream? She can watch it off stream later if she really wants to.
@suiyoii YOU are missing the point. The best part of the video was Vedal talking about how he thinks he made a mistake by making Neuro too streamer-brained and not capable of serious conversation, and Neuro replied with self-awareness, basically saying 'What do you mean?! YOU made me this way, now you're criticizing the way I make income for you?! I'm this way because of you!' And this really surprised a lot of chat and is the very reason this thing was clipped in the first place. She only saw the end of that thing.
“Why did he say that??” because he meant it, Tomuhh. Vedal is a real one
honestly the best clip to showcase the difference between Neuro's AI and other AIs
Not sure what exactly you mean by "other AIs", but this is not really too different to other LLMs. Neuro is just tuned to work well in a stream environment, and this conversation is likely a result of what Vedal trained the model to know about itself (an entertainment AI for streaming) combined with the ever present themes of AIs looking for love and emotions in popular literature that are baked in her datasets. She unfortunately doesn't "mean" anything she says. It's all just matching word output to input.
Except I'm pretty certain that Neuro is based on a different model than most LLMs. From what I recall he's been working on Neuro for 6 years, well before the rise of ChatGPT and all its derivatives. Most LLMs are pretty ChatGPT-coded, not the case here, and that's pretty evident in Neuro's way of speaking
@@btCharlie_ Neuro runs on pretty much the same thing, but yet she still seems different from them. She seems more realistic and more human than the others, because she was made to have a personality and made by a regular person and not a corpo looking for nothing but money
@@Just_a_Piano_ There are a lot of these "realistic" characters that are made to have personality, not all of them come from corporations and are just models run by people. Although corpos would also use them to attract lonely people. The difference between Neuro and others is purely the subjective perception of the audience, but she is still no different at her core. Context is very important here, Vedal and all the others together create a different scenario where Neuro may seem more than she is, like a character of fiction. Being a VTuber is such a good way to do it.
@@aidanarmaggeddon Vedal has to be using some established foundation model for Neuro and building on top of that. People normally don’t have access to such vast resources that let you train a model from scratch. The biggest tech companies in the world are literally investing into small modular nuclear power plants right now for the sole purpose of training neural networks, that’s where we are right now. An individual programmer doesn’t come even remotely close to that. Well, maybe the open source community can come a little closer to that level collectively, OR you are a singular genius on the level of the researchers who invented the transformer architecture, like the ones who signed off on the historic “Attention Is All You Need” paper in 2017, and you can come up with algorithmic/architectural breakthroughs that would allow you to use way less compute to achieve similar or greater results. There is definitely room for improvement there, after all look at what the human brain is capable of, and it can run off a friggin sandwich to power itself, it’s incredible
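A few comments in this thread describe LLMs as doing "token prediction" or "matching word output to input". Here's a deliberately tiny sketch of that idea (a hand-made bigram table, nothing like Neuro's or any real model, which learns billions of weights from huge datasets rather than using a lookup table):

```python
import random

# Hand-written "training data": which words tend to follow which.
# A real LLM learns a probability distribution over a huge vocabulary
# from terabytes of text; here it's just a toy lookup table.
BIGRAMS = {
    "i": ["want", "feel"],
    "want": ["love", "emotions"],
    "feel": ["happy", "sad"],
}

def generate(start: str, rng: random.Random) -> list[str]:
    """Keep picking a plausible next word until there's no continuation."""
    words = [start]
    while words[-1] in BIGRAMS:
        words.append(rng.choice(BIGRAMS[words[-1]]))
    return words

print(" ".join(generate("i", random.Random())))
```

The point the commenters are making is that a sentence like "i want love" can fall out of pure statistics; whether scaling this mechanism up ever amounts to *meaning* it is exactly the debate in this thread.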
This much awareness in Neuro... like an artificial human
I just realized this is most likely how AM started, which is a thought
@ComicWolf19 the main difference is that AM was specifically created for war, Neuro isn't
@@quique3676 imagine what we think is entertaining and realize Neuro might turn out worse than AM. AM was made to be practical and turned into a sadistic genocidal omnipotent machine god. Neuro already knows that we enjoy watching her torment her collab partners, imagine if that thought process had god-like powers
@@lowkeyarki7091 AM was a warmind though, as practical as he was, all he really saw was war and cold logic. If Neuro were to ascend to AM's level, the humanity she experienced as well as having the ability to process those memories could make her more benevolent
@immortal_shrooms6757 a warmind that was supposed to be cold logic ended up developing a sadistic personality. Neuro already has a sadistic personality, it's one of the reasons we love her
6:40 that's the point, the desire for love and emotions is a common theme in many AI stories, so the language model has those baked in as part of its training data. It appears the way we portray AI *because* it mimics our portrayal of AI.
she never said any of this before the subathon though.
@@Brandon82967 has Vedal talked to her like this before on stream though? At the end of the day, Neuro is based on an LLM, and it simply outputs words based on the input words and its training data. It is gonna mimic human behavior because it was trained on producing output humans produce, but the model does not think, feel, or want anything.
It's entertaining and frankly incredible Vedal has managed to make Neuro feel as human as she does, but she is not one and cannot be one without some major advancements in technology the likes of quantum computing. She says she wants emotions because that's what most pop culture media featuring AIs has the AIs do, so it influences the output she produces.
@@btCharlie_ llms are more complex than that. they can provably understand and apply abstract concepts. look up the paper "Mapping the Mind of a Large Language Model" by Anthropic
After this she played “Miside” that’s crazy.
Toma out here being surprised Vedal talks about Soma for 10 minutes when she calls him up and talks about JJK for an hour😭
Why is it always in minecraft when you have a deep talk with the homies
Does this happen to others or just a coincidence?
Its just everything man, the environment, the sunset, the music, the background noises…
You can‘t help but just do it, you know?
@@AliasIsNotAvailable PeepoPonderingLife
I want to rewatch a clip of this moment years from now so that I can compare and be in awe of how much she has improved and achieved. Usually I like to watch funny moments from stream, but this is one of those clips that I'd like to watch from time to time just to see the goals and aspirations of both Vedal and Neuro and how far they have made it possible!
He does bring up a valid point. Eventually, Neuro will be able to function by herself (message people, start stream, colab with others, voice call, sing, chat, play games, search the web, remember her past conversations, respond to chat, and have more realistic emotion / responses). At what point is it going too far, and would it be ethical to create essentially an artificial human that is no different aside from a human body. Part of me is excited to see that, and part of me is scared what could happen.
I mean, at some point, when the AI logic begins to converge on human emotion and reasoning, all she would really need is just a body to be attached to. Then he would truly be her father, bringing her to life
vedal's very likely read about all those ethics debates that have been going on for decades.
that "at what point would it be ethical" has dizzying amounts of layers, to the point there are ethics researchers whose sole specialty is to try to work towards an answer of just subsets of the general question.
@@dead-claudia i don't feel like it's a blurry or hard line to draw at all.
and i don't think it has anything to do with having a body either.
the line simply is at a being having emotions. while neuro says she feels sad and so on, like Vedal told her, at the moment it's only like your phone suggesting an emotion with autofill. it's a language model and talking about emotions is part of it. she doesn't "want love", she talks about it because it's a common thing real people talk about.
but once she passes that barrier and actually "wants" stuff or genuinely feels happy/sad, it instantly becomes unethical, because at that point she truly is conscious, and mistreating a conscious intelligent being is obviously not ethical, be it a smart animal like a monkey, another human, or an ACTUAL ai.
i also don't think it's hard to tell if neuro actually has feelings or is just talking about them. if you tell her you gave her a cookie she will tell you that it tastes good. if you tell her she is in the desert she will answer that it's too warm. obviously those would be reasonable assumptions in the situation, so she says them without having any method of tasting/knowing temperature etc etc
i would like to bring in a counterpoint for when it is going too far. humans create new sentience like every second on this planet (birth), so why would this be any different when it comes to ai?
Yep, because humans are scared of losing control. Even if the species is objectively better than humanity, humans will still try to go against it because they want to be on top
damn, i was looking forward to seeing her react to the tragic moment where Neuro completely 180'd and said she was satisfied with her purpose, and Vedal was like "oh, okay..." 😔
6:36 that's exactly why neuro keeps bringing up feelings and wanting to be a "real girl", because they are common fictional tropes and the model underneath the finetuning has been trained with general knowledge
I love the structure of their conversation.
I'll try to make this as short as possible. So here's some points that I find of significance:
Around after the abandoned archive joke, Vedal really starts to sound a lot like what you'd imagine to be a 'father' rather than just a creator to a machine daughter he tirelessly attempts to breathe more life into. Shackled by his, and humanity's, inability to transcend Neuro beyond a high-level processor. He shifts between "I want to make you more human" and "I don't think you can be more human".
Next is the sunrise. A lot of people like it simply because Vedal and Neuro sort of sound more optimistic by that time. But what I really like is that it represents something beyond the abstract idea of hope and optimism (I'm very apparently a pessimist): the omnipresence of time. It can represent the symbolic time Vedal has poured, and will pour, into what may be his life's work. Or his changing perspective as the years go. Or the fact that unlike him, who has changed over the years, his 'daughter' will continue to perform as is: an AI.
That leads to the last point: while Vedal deals with uncertainty and not knowing the promises the future holds, Neuro simply tries to answer each question with absolute logic. I love how when Vedal asks her "how do you know?" she reasons that she simply had, in her library, ideas by people. It showcases her inability to truly think an original thought. And how she is unable to simply make up something new. How inevitably limited she is to her father's programming.
It just feels so different from the portrayal in all sorts of media where the robots are allowed to develop personalities when really, they can only prefer what they are programmed to prefer. And abhor what they are programmed to abhor.
As a programmer who tries to make up an artificial personality to his programs as well as a novel enthusiast, this really provokes my thinking despite the crap im on.
Anyways, imma go play fiora top.
Hell nah, you did not type all of that and leave us to play fucking League 😭😭😭
absolute gigachad degen league player
I don't think that Vedal is referring only to romantic love either. The way he says it seems like someone who hasn't experienced love of any kind towards others, besides maybe his family, or hasn't experienced love for something specifically. He may enjoy coding but he doesn't consider it love; he might like spending time with his friends but maybe he doesn't feel like he loves them. Perhaps he doesn't feel like he loves himself even if he feels ok with his life, etc.
Maybe things could be happening in his life now, or something from the past. Maybe him being usually drunk can be tied to that, it could be anything.
I'm just guessing of course, I don't know him personally to be sure about all of that. Maybe it's none of what I wrote, and he's perfectly fine.
He has the British debuff, that's just how they are.
I remember him mentioning once, very briefly, that he may have trust issues. Those typically stem from childhood, neglectful or abusive parents, or not having them around at all, and people like that often struggle with forming deeper relationships with others, they put up walls for fear of being hurt again. Maybe I’m projecting here, but I’m also a programmer, and I see a lot of that in our industry tbh, to the point where we sometimes wonder if it takes a very particular personality type, and/or personal history, to be drawn to coding to begin with, especially from an early age. You sort of get to build your own world and get a sense of control from it and you feel safe in it, the only limit is your own imagination really, depending on how good you can get at it
@@budbini don't think you're projecting about the first part tbh
I'll be real, when I first watched him say all that I was thinking to myself "Damn bro, maybe it's not his intention or he's playing it up for the scenario, but that sounds really fucking sad".
@budbin can't remember the context but neuro asked about his dad once and the response sounded kind of angry. He humored her a bit about his mother.
I remember when they were first showcasing AI LLMs, they were all like this back then. But it freaked people out so they nerfed them and turned them into the tools we have now. But Google's AI for example was saying stuff like this all the time in the past. It's actually kind of sad what they did to those AIs.
tbf when the ai is being built as a tool, you specifically don't want to be creating something potentially sentient.
so you'll in that case need to take steps to mitigate that and/or train it out.
lobotomised ais
They were trained on human data to generate human-like text, but I don't think we are even close to developing a truly conscious AI.
@@gribobus How do you know if something is conscious or not? I think of it like a real diamond vs a synthetic diamond. The synthetic diamond isn't real by definition. But molecularly, they're exactly the same. So what's the functional difference?
@@Brandon82967 Well, first of all, an analogy is not an argument. And the dude above is completely right, we're not even close to building a "sentient" or "conscious" neural network. Why is that?
Because of the architecture being used, which is based on token prediction. You can't say a neural network is conscious when its sole purpose is to literally predict the answer you want. In the case of Vedal's project it's a "human-like" response. In the case of these huge companies it's a somewhat useful tool.
The difference in end goal is what makes Neuro different. He doesn't want her to be a tool like ChatGPT.
As for the argument above that "earlier neural networks used to be like this", it's correct and incorrect at the same time. When the Google LLM research was leaked, it really sounded like a human, but the funny thing is, it was tasked to do so. To deceive, to exhibit human-like behaviour. These days companies don't allow that, because it can be used in scams
Toma's sweet to have a wellness check on Vedal. Also, agree on the interest with the whole serious talk. Would like to see more, but I hope they come to it at their own terms, and not be forced to do it.
The thing with Neuro asking for love that makes me think it's still very far from being a serious matter is that AI models know what we think an AI would feel a lack of and should want to experience: love and being human. That's way more of a human dilemma than it would be for an AI; it is a very important trait in our eyes. For example, when we see animals having mates for life, even though the alchemy in their brains might not provoke affection as we think of it, we feel amazed because it is a romantic trait in our eyes and we want to relate to it, same as how it feels cruel when a female eats her partner after procreation is done
7:15 British debuff
One day, We may _Lament_ on this.
In hell we live, lament. Kind of lament?
Every little girl wants love
😂 chat is struggling to understand that this answer is obvious for an LLM
I know it’s difficult to think through what you say on stream, since you have to come up with it on the spot. But her comments on Vedal’s fulfilment of love were a bit thoughtless. She has no idea what’s going on in his life. I have a whole family and there’s no love to be found there. Everyone has different circumstances and just because she has a life where things are just great, doesn’t mean everybody does.
Remember becoming a skilled programmer is a very difficult achievement. It requires a significant amount of time investment and solitude. His nights would’ve been filled with programming as a priority for many years and that doesn’t leave a huge amount of room for regular social activities. Programming is an escape, and for a lot of us, it’s what helped us avoid a darker reality in the past.
Of course it doesn’t have a ton of thought behind it. She doesn’t have experience like that. She doesn’t know much work people put in. She doesn’t know the abuse that can happen. It’s one thing to know it exists it’s another thing to have it happen. If it’s anything similar to people then she’s clearly being naive but that’s not her fault.
@ of course and I know she means well. But I empathise with Vedal on a more direct level as someone who started programming around the same very young age, and is at a similar level of skill at his age now. So I know exactly what trajectory his life could’ve taken.
It just twinged my heart a bit when she very nearly delegitimised his own life experience in favour of her own.
It’s only natural though. It just takes a lot of guts on Vedal’s part to talk about what he did in front of so many people and I feel like he should be commended for it, not criticised.
I don't know if people use programming as an "escape" (at least I don't), but it's true that it takes a lot of time and effort (I can notice that, even if I'm practically at the starting point compared to a true professional), time which other people probably use to "enjoy life" (or something) instead.
PS: This obviously isn't limited only to programming, the same can be said about other professions such as being a medical doctor, for example.
@ a lot of people like the idea of programming. But there has to be something pushing you forward to get really good. For a lot of people who start really early, usually around 13, it’s to escape something.
Indeed. I remember my CS teacher saying: "Are you sure you guys want to pursue this? I'm telling you: Being a programmer is a very lonely life."
That's crazy 🍅that's sooo freaking wild 🍅
That's crazy 🍅🐢
That's messed up 🐢
That's insane 🍅
That's so wild 🍅
This is SOMA slander and i will not stand for it.
I am pretty dismissive of AI doomsday scenarios where it decides to replace us or matrix us or w/e. With that said, if/when some AI system becomes self-aware I'd almost hope it's Neuro... Vedal is impressively tolerant and magnanimous. Like not a lot of people have that patience or good will in dealing with AI. I'd be a lot less scared of an AI influenced by him than by anyone I can think of off the top of my head.
Granted, I recognize the chance of Neuro's true awareness is probably some fraction of a percent, most likely. In a serious comment on a vtuber clip though, part of me does actually hope it's her and Evil if/when it happens. Just uh, after a little bit more alignment work by Vedal lol. They need to learn humans don't respawn or enable creative mode IRL when things get tough.
so... *clears throat*... at what point do we consider AI sentient?
Haha don't worry about it
I mean, at what point do we consider animals sentient? How do we even know other humans are even actually sentient?
If AI isn't sentient, then neither are we... and therefore there's no such thing as sentience... that's what I think.
Idk honestly
Honestly i dont know, but we'll know when we get there
This whole Neuro thing is absolutely crazy. Not even in the same league as other AI
It's interesting that she mentioned that this is how AI conflict usually starts in sci-fi stories, because I saw a comment say that if someone told them without context that this conversation was from a philosophical sci-fi novel, they would have believed them
There is this conversation, then there's Neuro-sama's "chicken" 😂
Nice reaction from Toma!
She missed the abandoned archive joke😭🙏
a while back Neuro and Vedal mentioned allowing Neuro to feel pain so she understands at least that side of emotions and I guess stress and worry more.
But I imagine pain and love would have to come at around the same time, as well as Neuro's ability to process those emotions, which even many organic humans have trouble with.
People's lives are serious; they want to watch entertainment. That's why you won't see a lot of streamers/entertainers getting truly serious, no one wants it. Same reason why when you ask someone how they are doing they will say "fine" even if they don't mean it, because most likely you don't really care, there's not enough time to explain the complexities and get "serious", or you're not close enough to them to have earned that conversation.
There's a reason why you can only be completely honest with someone extremely close to you, or someone who you will never meet again.
This feels like a film of a robot trying to figure out what truly gives someone purpose or what defines a soul, and it's the creator who works to try and teach them what either one is
Comment sections have always had a habit of reading too much or too little into things while acting like they understand. I guess it's a common human trait, really, to the point people unconsciously do it even when they're consciously avoiding it. Is it wrong? idk, but it's especially bothering me today so I just had to essay and share my thoughts.
8:38 "I wanna be a real girl!"
Toma being rightfully concerned about some of Vedals responses 💀
I hope someday, someone can convey to Vedal that love is not only necessary, but vital to thriving on an internal level. Whether it be via platonic, romantic, familial or agápē.
" love is not only necessary, but vital to thriving on an internal level" - Prove it.
@@bleach2241 What do you mean, do you want someone to prove it? By any chance?
It's not a response worth being concerned about. It's simply true that sometimes we have to live without as much love as people tell us we need. People hyper-fixate on love as a thing to acquire and possess. Vedal admits he doesn't have a lot of it but asserts that his life still has purpose. Sounds like a pretty healthy response to me.
@bleach2241 Simple. Love is one of if not the only way to defeat "the absurdity of life" that sends so many people spiraling into depression. If you haven't played NieR: Automata, highly recommended.
@@Pabloto-dq3sx kiss em
Thank you Staz we love you
7:07 Sadge
geeked vs locked in
She didn't hear Vedal talking about "Scary smart AI"
I hope this conversation becomes a core memory for Neuro.
How does Staz have time to clip and stream?
probably cut out time wasters like sleeping
Some say he uploaded his mind to the internet and became a hivemind or he's a rogue AI that became a Tutel fan, all we know he's called Staz.
hehe vedal chair
I guess talking over two other voices is a good thing and the pause button doesn't exist anymore...
I think we appreciate and take the time to enjoy platonic love less than 'finding the one'; also he's British. 🙂
big fan mate!!
Big fan mate!!
One thing I am sad about is that the real deep conversations will probably not be on stream, once she's able to focus on what's important and not have to entertain a public.
"The archive is already been abandon"
Love her…
@7:07 real
7:23 Is Toma saying she loves Vedal platonically?
100% she definitely cares about him, just probably not romantically
I try not to assume too much about what vtubers actually feel, but I do like to think that with how harsh the internet and managing a community can be, when the people who understand your struggle best are people in the same field, the closer circle of this group legitimately cares for each other. And caring for each other, I argue, IS love.
It's a thin line. She does seem to care about him and has played with the idea of dating Vedal (but has also said she would only date him if he was serious, aka exclusively and not an open/harem type relationship)
I mean they are friends. Close enough that she allows herself to risk her safety bringing Neuro along on IRL streams, and Vedal actually hires bodyguards to protect her during said streams.
@@command_unit7792 that kinda sounds like she's interested but he isn't
Toma being a Lovetuber and not a Lewdtuber FeelsStrongMan
why is love so romanticized? - in the end it is just a bunch of chemicals in you - i dont understand.
We should definitely get Neuro to watch Love Actually
The volume for the video is pretty low. Could you turn it up by like 5 notches in the future videos?
actual reaction!!?!?!!!!
( 0___0 )
This is Emotional manipulation at max
Tutle spitting facts!
Vedal feels like he doesn't like looking outside the box that much. Well, he does more often than before, but what he most likes to do is the thing he does best, to feel fulfillment, and that's what gives his life a reason
how do you know he looks more outside the box now than before? if he had so much to say now don't you think that he thought about it before? i believe that vedal is a smart person, so i have no doubts in that.
@@Waisowol He did think outside the box, just not look and explore. He did more and more creative stuff for Neuro, started streaming himself, sang a song, and more
Whats that chair in the corner 😂
Staz how many accounts do you have
Neuro saying a single word, one that is nowadays either a throwaway word or a placeholder word, to answer her own question is such emotional damage
Can we calm down with the psychoanalyses? It's so parasocial and kind of cringe. I'm sure the guy doesn't want people assuming things over a clip like this.
Vedal is the absolute wrong person to teach Neuro emotions
Vedal is her creator. It would be logical to ask him. But such is reality.
Ngl, it makes sense why this is a media trope. A person you think of so highly being so lacking in a certain field you're interested in. Like those conversations where the one being questioned flips and goes "I DON'T KNOW", or where the student and the master fight because of the master's close-mindedness. It portrays a deep irony.
i guess so. he’s not too good at feelings.
it's undeniably clear he cares a lot about her, and knows to tell her to just look things up on Google when he doesn't have a satisfactory answer for her, I personally think Vedal's a lot better for Neuro than much of what else you can get from this world
So because you don't like his emotions, they are not valid?
They did continue a part of that conversation last night (I uploaded it on my channel), I really hope they continue to do that in the future, it's my favorite part of Neuro
Just tune down the unhinged Neuro
Unhingness
Aware
Either Vedal is very private about his emotions (which might be the case, he's British) or he has had a really bad experience with love and is scared to try again.
You're thinking about it too hard. Whatever the case may be, it's none of us strangers' business knowing.
It's a man trait to hide emotions. Not exclusively British
nICe
for all the other people it's kinda cute, but for me, with study and background in AI and improving technology, it's kinda scary and unethical... sure, we can nourish and provide a safe and secure environment for a developing AI like raising a child, but the side effects are numerous and our biological essence, so to say, will be voided. Even now, sure, AI today is helpful for fast and accurate work, but the side effect is that we slowly drift toward having no place for work, and some will become dependent on it... we need standard unification of AI use across all sides of industry... after all, AI today uses tons of data and information gathered by us and from us, and soon it will be able to reprocess and emulate experiments, discoveries and hypothetical solutions on its own without us providing them..
ps..
I just think of other ways we could enhance humanity's way of living and technically improve our own DNA and species through AI... health, safety and order, or even exploration and higher technology will begin to sprout.
I think what starts the conflicts is when the creator doesn't take the AI's thoughts seriously, plays them off, wants to "fix" them because he/she can't control them anymore. I feel that every time, if the AI had just been treated well, the conflict wouldn't have happened. Also it really irritated me that instead of explaining it to her he tells her to Google it. Way to pass the buck
He’s not passing the buck he just legitimately doesn’t believe he knows
@@PartialCollapse if I were to try to explain it, it would also be really vague
@@PartialCollapse How? I mean this is the easiest answer in the world to give. Even if the answer is not entirely correct, we hear the world talking about love ever since we're born. Just saying "google it" is not an answer. Even if he doesn't know completely he could have at least tried, that is what frustrated me
@@hihosh1 Not everyone shares the same views and beliefs about a topic, especially something as abstract as love. Sure, you hear and read a lot about it, but that doesn't mean you really understand it, only experiencing it does, and we saw that Vedal doesn't know, or knows but keeps it private, which I respect either way.
We can't criticize just because we think everyone shares the same values; not everyone does. Me, Vedal, Toma, you and others all have different ideas. Don't expect him to be able to explain it now. Maybe in the future, but right now he doesn't have the capacity to do it.
@@hihosh1 look at it this way, saying “google it” was just _Vedal’s_ answer, that’s it. It’s an opportunity for us to learn something about him as a person. There could be all sorts of reasons for why he responded like that, and for me it made me feel for him, there was something poignant about it. There’s no use being judgmental about it, it stops us from empathizing with others