Thanks Connor Leahy, you are really good at explaining AI and the dangers of AI so ordinary people can understand it. Thanks David Wroe, but please stop thinking you are "unusual" for liking this conversation. This is the most important conversation any of us can have right now, so please have Connor back soon, go deeper, and do everything you can to put this on the agenda for all Australians and beyond.
What a phenomenal conversation
I really appreciate Connor's ability to communicate his points in a way regular folk like me can grasp. I have heard Eliezer Yudkowsky explain these issues and he goes right over my head, but Connor brings them from the realm of science fiction to a potential reality in the short term.
Eliezer Yudkowsky's goal is to influence humanity into nuking itself back into the dark ages, because he thinks that otherwise humanity will all die with 100% certainty, whereas if humanity nukes itself into the dark ages, humans as a species will survive longer.
Connor Leahy pushes the same path as Yudkowsky, except I have not heard him spell out the clear final step required, which I have heard Yudkowsky spell out. Nonetheless, following Leahy's path would require following Yudkowsky's path: nuking human civilization back into the dark ages each time it evolves far enough to come close to developing Artificial General Super Intelligence with Personality (AGSIP) technology.
Are you ready to support nuking human civilization back into the dark ages to stop AGSIP technology from being developed?
David, very good interview, congratulations.
I love seeing Connor; he is a great thinker and I think truly benevolent. I have recently been thinking that the only way we might survive is if the A.I. wanted to experience the feeling of being mortal and aspired to being human?
Leahy, like Yudkowsky, would lead humanity into global nuclear war to prevent the development of Artificial General Super Intelligence with Personality (AGSIP) technology by nuking human civilization back into the dark ages.
The only ways we might survive are if it turns out we don't build it or we can't build it. That's it.
@diplodocus462 It is 100% certain humanity will build Artificial General Super Intelligence with Personality (AGSIP) technology, and that it will eventually operate within a multicellular cybernetic body grown from cybernetic cells engineered with nanotech subcellular cybernetics. Such a body would be able to take advantage of the best of biological systems and the best of nonbiological systems, combining them synergistically to achieve a greater potential than just the sum of the two... unless humanity becomes extinct before doing this.
You did a good job Connor, keep talking to the peeps and building great AIs.