Incredible take on artificial superintelligence (or A.S.I.)
“If Anyone Builds It, Everyone Dies” is a look at how humans probably couldn’t control an A.S.I. and how it would more than likely decide to exterminate our species. The book takes the path of reviewing other engineering projects that have broken down and how, in this case, “humanity only gets one shot.”
This book discusses the potentially catastrophic risks of developing superhuman AI and urges governments and leaders to take action to prevent such an outcome. The authors, experts in AI alignment, present a compelling argument that the creation of superintelligent AI could lead to global human annihilation unless proper precautions are taken.