"If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter." Must-read by @ESYudkowsky. This is why I prefer to call it "Inhuman Intelligence."
"If we held anything in the nascent field of Artificial General Intelligence to the lesser standards of engineering rigor that apply to a bridge meant to carry a couple of thousand cars, the entire field would be shut down tomorrow." time.com/6266923/ai-eli…
"We are not ready. We are not on track to be significantly readier in the foreseeable future. If we go ahead on this everyone will die, including children who did not choose this and did not do anything wrong. Shut it down." @ESYudkowsky time.com/6266923/ai-eli…
@nfergus Could they build a robot with AI? I know you'd need a huge data center, but I can see it happening the way things are moving.
@nfergus How is AGI meant to kill everything living on Earth? Aren't these systems just incredibly good at strategy games with set rules and at searching the internet incredibly quickly?
@nfergus Very compelling read, although the assumption is that AI will kill us all. I'd like to read the support for this theory. It seems to hinge on the model of scarcity that pervades all aspects of our current lives. In the absence of scarcity, what incentive does an AI have to kill?