Exaptationist vs adaptationist Singularity, and engineerability

In terms of total biological/social/technological complexity, human society may still be closer to the first single-celled lifeforms that colonized the Earth 4 billion years ago than we are to a technological singularity. But in terms of the logarithm of complexity, we're nearly there -- we have covered most of the orders of magnitude of progress we'll need. And because positive feedback loops keep growth exponential or faster, the logarithm is the scale that matters.
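To make the linear-vs-logarithmic contrast concrete, here is a toy calculation. The numbers are entirely made up for illustration -- nobody has a real unit of "total complexity" -- but they show how the same trajectory can look barely started on a linear scale yet nearly finished on a log scale:

```python
import math

# Toy numbers, purely illustrative -- not real measurements of "complexity".
start = 1.0          # first single-celled life (arbitrary unit)
now = 1e12           # hypothetical complexity of present-day civilization
singularity = 1e15   # hypothetical complexity at a technological singularity

# Linear view: fraction of the total distance covered so far.
linear_progress = (now - start) / (singularity - start)

# Logarithmic view: fraction of the orders of magnitude covered so far.
log_progress = math.log10(now / start) / math.log10(singularity / start)

print(f"linear progress: {linear_progress:.3%}")  # ~0.1% -- barely started
print(f"log progress:    {log_progress:.0%}")     # 80% -- most of the way there
```

Under these assumptions we are a tenth of a percent of the way there linearly, but 80% of the way there in orders of magnitude -- and with exponential growth, each remaining order of magnitude takes about as long as the last one did, not a thousand times longer.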

The Singularitarian's burden

The other day, it struck me how remarkable it is that all of this progress was made without anyone, or anything, ever having to concern itself with what its own or any other species was evolving into, or how quickly.

If we'd come that far without a targeted effort, how much faster could we progress toward the Singularity if we organized for that explicit purpose? Or if we weren't in a hurry, could we reduce the risk of an unfriendly Singularity, i.e. get some "Terminator insurance"? The standard assumption among Singularitarians seems to be that switching to "engineered evolution" is desirable if not essential, now that we understand the nature and dynamics of the positive-feedback loops involved.

But the first attempt to engineer humanity's evolution within a post-Darwinian framework was eugenics, which was clearly premature given Victorian-era science and did far more harm than good. Attempts to engineer technology's evolution (MIRI, the Lifeboat Foundation, etc.) haven't caused a Holocaust yet, but they have led to a scientist receiving two death threats. At best they've been a distraction: they haven't produced any innovations that could feed the feedback loops -- say, an algorithm a programmer could actually implement, compile, run, and get useful output from. And if the Singularity can be engineered, why did Ray Kurzweil leave a successful engineering career (while allegedly in great health) when he discovered the concept?