■ In this week’s Backchannel: Eliezer Yudkowsky, AI’s prince of doom, explains why computers will kill us and provides an unrealistic plan to stop it. Also a look at the early days of doomerism and whether NFTs can restore their reputation by stopping AI deepfakes.
The subtitle of the doom bible to be published by AI extinction prophets Eliezer Yudkowsky and Nate Soares later this month is “Why superhuman AI would kill us all.” But it really should be “Why superhuman AI WILL kill us all,” because even the coauthors don’t believe that the world will take the necessary measures to stop AI from eliminating all non-super humans. The book is beyond dark, reading like notes scrawled in a dimly lit prison cell the night before a dawn execution. When I meet these self-appointed Cassandras, I ask them outright whether they believe that they personally will meet their ends through some machination of superintelligence. The answers come promptly: “yeah” and “yup.”
I’m not surprised, because I’ve read the book—the title, by the way, is If Anyone Builds It, Everyone Dies. Still, it’s a jolt to hear this. It’s one thing to, say, write about cancer statistics and quite another to talk about coming to terms with a fatal diagnosis. I ask them how they think the end will come for them. Yudkowsky at first dodges the question. “I don’t spend a lot of time picturing my demise, because it doesn’t seem like a helpful mental notion for dealing with the problem,” he says. Under pressure he relents. “I would guess suddenly falling over dead,” he says. “If you want a more accessible version, something about the size of a mosquito or maybe a dust mite landed on the back of my neck, and that’s that.”
The technicalities of his imagined fatal blow delivered by an AI-powered dust mite are inexplicable, and Yudkowsky doesn’t think it’s worth the trouble to figure out how that would work. He probably couldn’t understand it anyway. Part of the book’s central argument is that superintelligence will come up with scientific stuff that we can’t comprehend any more than cave people could imagine microprocessors. Coauthor Soares says he imagines much the same ending for himself but, like Yudkowsky, doesn’t spend a lot of time dwelling on the particulars of his demise.