If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
by Eliezer Yudkowsky and Nate Soares
ISBN: 9781847928931
Published: 2025-09-18
AI is the greatest threat to our existence that we have ever faced. The founder of the field of AI risk, Eliezer Yudkowsky, and his successor at the Machine Intelligence Research Institute, Nate Soares, explain why superintelligent AI is a global suicide bomb and call for an immediate halt to its development. The technology may be complex, but the facts are simple. We are currently on a path to build superintelligent AI. When we do, it will be vastly more powerful than us. Whether it 'thinks' or 'feels' is irrelevant: it will have objectives, and they will be completely different from ours. And regardless of how we train it, even the slightest deviation from human goals will be catastrophic for our species - meaning extinction. Precisely how this happens is unknowable, but what we do know is that when it happens, it will happen incredibly fast, and however it happens, all paths lead to the same conclusion: superintelligent AI is a global suicide bomb, the labs who are developing it...
Compare Prices
- O'Mahonys (Pre-Order, checked 24 days ago): €21.25 + €4.99 shipping (free over €30), total €26.24
- Easons (Out of Stock, checked 1 day ago): €24.65 + free shipping (free over €10), total €24.65
You Might Also Like
The Art of Warhammer Video Games
Andy Hall
This is For Everyone
Tim Berners-Lee
Nexus: A Brief History of Information Networks from the Stone Age to AI
Yuval Noah Harari
As Leeds Go Marching On
Jonny Cooper
Benny the Blue Whale
Andy Stanton
Outrage Machine
Tobias Rose-Stockwell
Genesis: Artificial Intelligence, Hope, and the Human Spirit
Eric Schmidt
Playing With Reality
Kelly Clancy