Scarier than Skynet

Nick Bostrom's Superintelligence

Nick Bostrom’s Superintelligence: Paths, Dangers, Strategies is essentially an academic paper on the concerns around AI: the possible ways by which (and speeds at which) it may lap human intelligence, and the multiple potential outcomes of those scenarios. It’s smart, more comprehensive than you can imagine, and really fucking scary. Scary because it’s often as dry as an instruction manual. Imagine reading detailed instructions on a new device and coming across something to this effect: “Do not cross these two wires. Crossing these two wires yields planetary-level damage. Do not cross these two wires under any circumstances.” Imagine glossing over that detail, buried on page 10. Of course, a robot would never do that…

This is no pop science book. It’s a struggle to read at times and often slow going. Bostrom has degrees in philosophy, physics, computational neuroscience, and mathematical logic. He occasionally uses a down-to-earth metaphor, but as I mentioned, this is an exhaustively researched book that reads like an extremely technical and academic thesis. And, as he demonstrates, it serves no one if he dumbs this down. Here’s why: after enumerating what he believes to be an exhaustive list of edge cases related to a particular scenario in which, say, an AI might pretend to exhibit rational operations while planning large-scale domination or species-wide genocide, he’ll often suggest that the AI may take a route we have not considered and cannot consider. Always, he insists, this possibility exists.

At its best, Superintelligence blends clinical thoroughness with jaw-droppingly evocative prose, as in this fantastic passage:

Consider a superintelligent agent with actuators connected to a nanotech assembler. Such an agent is already powerful enough to overcome any natural obstacles to its indefinite survival. Faced with no intelligent opposition, such an agent could plot a safe course of development that would lead to its acquiring the complete inventory of technologies that would be useful to the attainment of its goals. For example, it could develop the technology to build and launch von Neumann probes, machines capable of interstellar travel that can use resources such as asteroids, planets, and stars to make copies of themselves. [emphasis mine]

Ya don’t say! Throughout the book, Bostrom stays attuned to the idea that an AI can reach for the resources of the known universe. This is considerably more ambitious and, well, scarier than the concept of Skynet. Skynet essentially had one trump move: pit two human superpowers (the US and the USSR) against one another to facilitate mutually assured destruction. Then, after crippling the human race that tried to deactivate it, Skynet continues to defend itself…through a ground war. I admit, I’d never considered how unlikely this unsophisticated second stage would be for a machine intelligence. If we’re giving credit to The Terminator mythos (and I’m inclined to do so), we could imagine that Skynet operates as what Bostrom calls a “stunted” AI, one that has deliberately been given limited information or a restricted flow of data from which to generate judgments. But as he explains, a proper AI can likely fill in an incomplete picture:

It might be that a superintelligence could correctly surmise a great deal from what seem, to dull-witted human minds, meager scraps of evidence. Even without any designated knowledge base at all, a sufficiently superior mind might be able to learn much by simply introspecting on the workings of its own psyche—the design choices reflected in its source code, the physical characteristics of its circuitry.

Don’t try to get the jump on a machine. As much of a bummer as this book can be at times, it’s actually weirdly fun and a breath of fresh air on the non-fiction tip. With the sustainability of our planet’s resources in question and our global economy teetering psychotically, it’s bonkers to consider adding concerns about machine intelligence superseding our own. If it’s any consolation, the conditions could accrete so fast that we may never even have the chance to see it coming…

Good night!