Eliezer Yudkowsky of the Machine Intelligence Research Institute did not sign the open letter from computer technocrats calling for a pause in the advancement of general AI. Instead, he believes there should be a total shutdown of the development of AI more powerful than GPT-4, because of the Terminator-like possibility that it could evolve to kill “every single member of the human species and all biological life on Earth.” He said: “To visualize a hostile superhuman AI, don’t imagine a lifeless book-smart thinker dwelling inside the internet and sending ill-intentioned emails. Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow. A sufficiently intelligent AI won’t stay confined to computers for long. In today’s world you can email DNA strings to laboratories that will produce proteins on demand, allowing an AI initially confined to the internet to build artificial life forms or bootstrap straight to post-biological molecular manufacturing.
“If somebody builds a too-powerful AI, under present conditions, I expect that every single member of the human species and all biological life on Earth dies shortly thereafter.”