Asset Details
If anyone builds it, everyone dies : why superhuman AI would kill us all
by Yudkowsky, Eliezer, 1979-, author; Soares, Nate, author
BOOK · 2025
Available to read in the library.
Overview
"In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next. For decades, two signatories of that letter -- Eliezer Yudkowsky and Nate Soares -- have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us -- and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn't even be close. How could a machine superintelligence wipe out our entire species? Why would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive. The world is racing to build something truly new under the sun. And if anyone builds it, everyone dies." -- Provided by publisher.
Publisher
Little, Brown and Company
Subject
Artificial intelligence. / Artificial intelligence Social aspects. / Human beings Extinction.
ISBN
9780316595643, 0316595640
Item info:
1 item available; 1 item total in all locations
| Call Number | Copies | Material | Location |
|---|---|---|---|
| Q335 .Y83 2025 | 1 | BOOK | AUTOSTORE |