Kurzweil: no prophet, but a name worth dropping

For most of my undergraduate career, I wouldn't have been able to write a blog post like this one. Without a thorough familiarity with keywords like "technological acceleration", "transhumanism" and "singularitarian", it was hard to articulate my visions and sense of purpose in life, and hard for Google to find me any of the relevant writing on the subject by futurists.

What finally got me going in the right direction was one of my profs name-dropping Ray Kurzweil this summer, which prompted me to start reading his 2005 book The Singularity Is Near: When Humans Transcend Biology.

I'd heard of a technological singularity before, but I'd dismissed it as an overly vague notion, probably because I'd relied on the sort of muddied definitions that tend to arise when people don't distinguish the three versions of the scenario. As Yudkowsky notes, confusion between these three versions tends to distort even their shared elements and create apparent inconsistencies.

I'd been interested in the concept of mind uploading, and Oxford philosopher Nick Bostrom's paper on the simulation hypothesis had introduced me to the idea of substrate-independence. I'd been aware that digitizing a human brain at the level of individual neurons and synapses would be hard, and had considered the idea that nanotechnology might help. Sooner or later, I hoped to see an expert opinion from the software-engineering field on whether or not it was doable this century. But I wasn't aware that Kurzweil had already given one.

I'd also long suspected that scientific discoveries and technological innovations might have an autocatalytic effect. They'd sometimes make it possible to invent new lab equipment or new engineering tools, thus increasing the pace and scope of subsequent research. But I mentally expressed it in Google-unfriendly terms of "autocatalysis" rather than "acceleration", and didn't know where to look for empirical evidence of it. I didn't appreciate the extent of automation in the semiconductor industry, so when profs said Moore's Law was probably about economics more than it was about technology, I believed them.

When I heard the name Ray Kurzweil, I finally had a jumping-off point. A little Googling led me to his book, which not only tied together the mind-uploading idea and the acceleration idea, but also showed me where they could lead and what was technologically feasible within my lifetime. Kurzweil demonstrated that his exponential-growth models fit the historical data, and that a wide range of experimental and theoretical technologies justified extrapolating them. As a computer scientist, he also avoided two common mistakes: confusing hardware with software when discussing the human mind, and blindly assuming that building the digital copy would be easier than running it.

Kurzweil's work isn't the be-all and end-all of Singularitarianism, and wouldn't be even if it were being updated regularly (the lack of updates is one of my biggest complaints about it). Like George Dvorsky, I'm concerned about Kurzweil's cult of personality. I'm afraid his handwaving of social, political and economic factors in his forecasts, not to mention his promotion of highly experimental health care, may be making him a straw man for anti-Singularitarians.

Still, The Singularity Is Near gave me what I needed when I read it: some cause for optimism, an ultimate purpose in life, clarification and consolidation of several nebulous concepts in my mind, and a starting point for further research. Instead of calling myself "an agnostic who believes in blah blah blah, fears blah blah blah and hopes for blah blah blah", I can now say, "I'm a Singularitarian and Transhumanist." It condenses my most important beliefs almost as efficiently as saying "Alice is a Secular Humanist" or "Bob's a conservative Catholic."

Ray Kurzweil may not be the prophet he's made out to be, but his name is worth dropping whenever you want to introduce anyone, especially a fellow computer programmer, to the concept of Singularitarianism.