watched some of it, I disagree with your premise's binary framing that we either can or can't. Alignment is probabilistic. There's certainly a real chance it might destroy all of humanity; the question is one of risk tolerance. For some, 20% is perfectly fine; others freak out at 2%. The real problem is that there's no meaningful way to quantify any of it, which gives way to endless discussions, most of which are useless.
Great points!
🙏🙏
They would just say that the risk is potentially humanity-ending (too big), not like the example you gave or most risks taken in human history.
High risk, high reward. Comes down to risk tolerance, I suppose.
It sounds like you're making the same arguments Yann LeCun used. I talked about those here: ua-cam.com/video/BygErhGbONA/v-deo.html