Eliezer Yudkowsky, an artificial intelligence (AI) researcher and decision theorist long known for warning about the technology's risks, believes that AI research and development must be shut down immediately lest it threaten all of humankind.
In an op-ed run by Time magazine titled “Pausing AI Developments Isn’t Enough. We Need to Shut it All Down,” Yudkowsky wrote that AI technology is making such rapid improvement that it will soon far outstrip even the smartest humans.
But because few precautions have been taken to imbue AI with strict ethical codes, he argues, a superhumanly intelligent AI would have no reason to care about humanity or to treat it as anything other than an obstacle.
“That kind of caring is something that could in principle be imbued into an AI but we are not ready and do not currently know how,” Yudkowsky wrote.
And in the likely event of conflict, humanity’s odds of survival are close to zero. While AI might only exist in the digital world for now, it could easily take over physical infrastructure and do away with humanity in favor of its own biotechnological creations.
“Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers—in a world of creatures that are, from its perspective, very stupid and very slow,” Yudkowsky wrote.
The advent of programs that can manage increasingly complex activities, such as holding conversations, writing their own code, and producing convincing photos and even art, has stirred popular debate, including about the future of jobs in a world where machines can do almost anything humans can do, and more.
Hundreds of leading figures in the tech field signed an open letter in late March calling for a six-month moratorium on AI development; Yudkowsky declined to sign it, writing that it understates the seriousness of the situation and asks for too little.
He stressed that all institutions and governments need to halt AI development immediately, because whether "we all live or die as one" from near-future AI advances is not a matter of "policy but a fact of nature."
“We are not ready. We are not on track to be significantly readier in the foreseeable future. If we go ahead on this everyone will die, including children who did not choose this and did not do anything wrong,” Yudkowsky wrote towards the end of his editorial for Time.