Catchy AGI podcast title: “We’re All Gonna Die.” But Eliezer Yudkowsky sincerely means it. Time to panic? Well, AGI doomers have been under fire lately. Fairly? First, let’s get to the heart of Eliezer’s worst-case scenario. I’ve made the case a little more visual:
[Video]
Then @ESYudkowsky adds: “That's the disaster scenario if it's as smart as I am. If it's smarter, it might think of a better way to do things. But it can at least think of that if it's relatively efficient compared to humanity. Because I'm in humanity and I thought of it.”
One reply asks: “@bakerlink I still don’t understand what would be the incentive for AI to exterminate the human race.”