News Analysis: With futurists planning for the so-called Singularity, the question is no longer what it will be, but how we will keep it from destroying the very nature of humanity.
Two of IT's greatest strengths are the speed at which it can be adapted to
changes in the way people use it, and the way in which it can change the people
who use it.
You could say this about tools like fire, which allowed humans to survive in
previously hostile climates, to cook food instead of eating it raw, and to
create more efficient tools such as charcoal in place of wood and steel in
place of iron. But the electronic computer as a personal device has only been
with us for 30 years or so, and it's already become a ubiquitous element of our daily lives.
Some thinkers believe that we're approaching the Singularity,
with machine-based intelligence taking over at the point where human
intelligence leaves off. Depending on who's talking, this is a good thing to be
embraced wholeheartedly, or an evil to be avoided at all costs. I guess it
depends on whether you listen to Ray Kurzweil or Ted Kaczynski; I don't pay much
attention to either, because I suspect that the future is going to be a mixture
of their predictions.
Part of my aversion to Singularity cheerleaders is that, as a society, we
don't really have a good definition for "intelligence." Many of us
know smart, talented people who are absolutely hopeless at everyday tasks, and
others who are utterly brilliant at the mundane, yet have no taste for what we
think of as the things smart people do, whether we're referring to language,
math or science.
As you might expect, most of us fall between those extremes. The problem
that some thinkers have with the idea of the post-human future, as the
believers in the Singularity define that, is the inevitability of an
overwhelming number of have-nots working for a handful of haves.
To me, that's pretty much the story of human civilization, whether we're
talking about Sumeria or the Holy Roman Empire.
I don't buy the arguments of those who say the Singularity will represent an
insurmountable obstacle for much of humanity, because we've been there before.
The issue for me is not one of, "Should we allow the Singularity to be
created?" Rather, it's what we do with it that counts. It's too late to
halt the progress in technology that is leading to the Singularity; that train
left the station a long time ago. What we can do, and must do to retain the
dignity of the individual, is to make sure that the power given to the haves
is not abused.
Power corrupts, whether it's economic or political in form. We may learn in
our lifetimes whether technological power is capable of remaining pure, or if
it too is corrosive to those of us who wield it and to our society.
That's where my vision of tomorrow becomes apocalyptic. Human nature being
what it is, the haves of a Singularity future are unlikely to be terribly eager
to share their good fortune.
What scares me most of all, far more than machine intelligence,
nanotechnology or neural implants, is the revolt that will follow if it turns
out that the Singularity is in fact corrupt. As in any other rebellion of the
masses, many good things will be destroyed along with the bad.
Of all the technologies related to the Singularity, the one that I find most
troublesome is life extension. Adding a decade or two to one's lifespan is one
thing; adding a century or two could send our culture down a very ugly road. I'm
not saying we should limit ourselves to the biblical threescore years and ten,
but I suspect that if humanity finds itself with a class of immortal haves, the
mortals will decide that they have nothing to lose but their chains.
Part of the evolution of human culture is the handoff from one generation to
the next. If a future generation decides that it's not going to play by that
rule any more, we're likely to wind up with a gerontocracy that makes the
leadership of the former Soviet Union look like Ken
Kesey and the Merry Pranksters.
Fortunately, as Ashlee Vance pointed out
in a recent New York Times article,
the business leaders of today who seem most interested in the Singularity may
be more focused on how to make money from it than on ensuring their
immortality. This could turn out to be one of those cases where short-term
greed works for the long-term good.
My concerns about the Singularity have less to do with the technology that
creates it; that's just a tool. What's going to be important, and probably
critical, is how we use that technology, and how humanity reacts to it. After
all, have you seen a post-apocalyptic future that looks like fun? Me neither.