Silicon Insider: Ray Kurzweil's Vision of Forever

Sept. 29, 2005 — -- Is there no change of death in paradise?

Does ripe fruit never fall? Or do the boughs

Hang always heavy in that perfect sky.

… Death is the mother of beauty, mystical

Within whose burning bosom we devise

Our earthly mothers waiting, sleeplessly.

-- Wallace Stevens ("Sunday Morning")

If you haven't heard of Ray Kurzweil, don't worry, you soon will. His new book, "The Singularity Is Near," is likely to become the most discussed science book this year -- and some pundits are already suggesting that every responsible person should at least know what Kurzweil means by "singularity," as well as the underlying themes of the book.

In case you have a cocktail party tonight, or a meeting with the board of directors this afternoon, let me give you a quick summary of Kurzweil's theory:

The term "singularity" is one you may have heard before, especially if you are a fan of cosmology. A singularity is an event so profound that little information survives it -- the ultimate singularity being the creation of the universe, during which (apparently thanks to some kind of quantum twitch) everything suddenly emerges from nothing. The best-known form of singularity occurs when a massive star, its nuclear fuel spent, collapses in on itself -- until it reaches a point of infinite density and zero volume, and forms a "black hole" that ruptures space and time and exhibits so much gravity that not even light can escape its embrace.

The first use of "singularity" in regard to the electronics revolution comes from, of all people, that founding father of computing (and game theory, and a lot of other stuff that defines modern life), John von Neumann. In the 1950s, von Neumann apparently said, with his usual superhuman prescience, that "the ever-accelerating progress of technology… gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, cannot continue."

Coming Sooner Than You Think

Basically, all you really need to know about Kurzweil's book is that he not only believes that von Neumann is right, but that this singularity in human affairs is less than 50 years away. And he, for one, intends to be there for this Digital Rapture -- even now, at age 56, he gulps 250 pills per day and takes even more nutrients intravenously each week, in the hope of preserving himself long enough to cross over, at the singularity, into immortality.

What will this singularity look like? For Kurzweil, it will be a "period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed." And by transformed, he means that mankind will move on to the next evolutionary step, in which our intelligence will rapidly grow to fill the entire universe. Think the Star Child in "2001: A Space Odyssey," except with HAL as its body. In becoming one with the stars, we will first become one with our machines.

Why will this singularity occur so soon? Because so much of the scientific world has been moving forward at an exponential rate. The best known of these progressions is, of course, Moore's Law, which holds that the performance of integrated circuit chips -- measured in speed, density or cost -- will double about every 24 months. And we have seen the extraordinary gains in chip performance over the last 40 years, whereby one modern Intel microprocessor is thousands of times as powerful as its model 8086 precursor was just three decades ago. No invention in human history has evolved so fast.
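If you want to check that arithmetic for yourself, a few lines of Python make a rough back-of-the-envelope sketch -- assuming the 24-month doubling cited above and a 1978 debut for the 8086, neither of which is Kurzweil's own calculation:

# A rough check of the Moore's Law arithmetic above, assuming performance
# doubles every 24 months and counting from the 8086's 1978 debut.
years = 2005 - 1978               # roughly "three decades"
doublings = years * 12 / 24       # one doubling every 24 months
improvement = 2 ** doublings
print(f"{doublings:.1f} doublings, roughly {improvement:,.0f}x the performance")
# Prints: 13.5 doublings, roughly 11,585x the performance -- "thousands of times," indeed.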

And it isn't just happening in chips. Today the blazing pace of Moore's Law defines not only chips, but also disk storage, fiber optic bandwidth, computer processing speeds, nanotechnology, even bioinformatics (remember how quickly the human genome was mapped, despite estimates that it might take almost a century?). Increasingly, every corner of our lives has begun to accelerate at this extraordinary rate.

And "extraordinary" doesn't begin to cover it. If you know anything about mathematics, what is going on should make your hair stand on end. As with the Chinese fable about grains of rice on a chessboard, exponential growth rates have the curious characteristic of starting out deceptively slow, then, in a curve likened to a hockey stick, getting real big, real fast. What seems like gentle, if slightly accelerated progress, suddenly blasts off towards infinity.

Kurzweil, like many others, has looked at all of these exponential curves and concluded that humanity is now right at the brink of that take-off. That will be humanity's singularity, when our technology suddenly races away to rule the cosmos. And it is his conceit that we will be part of it: abandoning our fragile organic bodies, mapping every neuron in our brains, porting ourselves over to computers, and then plugging ourselves directly into the Internet, and thus into all of the knowledge in the world. From there, our machines, improving themselves far faster than we ever could, will reach out to take command of the entire universe, turning it into a vast computer, a nearly infinite extension of ourselves.

It is a dream only a computer scientist could truly love.

Some Questions Arise

This column is hardly the place to tackle a full analysis of Kurzweil's argument, and large portions of the astrophysics are out of my range. Nevertheless, as this is likely to be a hugely controversial work, one that may even impact legislation and education in the years to come, there are some questions you might want to ponder before selling your house to buy nutrient pills or dropping out of college because you'll soon know everything anyway.

First of all, there is the matter of Moore's Law -- on which, having written about it more than just about anyone in the world, I am something of an expert. Kurzweil believes that the Law will keep working on chips -- and thus computers, the Internet, etc. -- ad (almost) infinitum. He believes that semiconductor companies, when they reach the physical limits of the planar process on silicon sometime around 2020, will smoothly switch to some new technology, likely at the quantum level, that will enable Moore's Law to go on until every atom, or quark, in the universe can be incorporated into the network.

That's a mighty big assumption. Even Gordon Moore doesn't believe his Law can go on forever. And though Kurzweil dismisses the notion that Moore's Law is really a compact between chip makers and their customers rather than a true physical law, that is exactly what it is. He ignores the fact that each new generation of chips drags along with it an ever-larger anchor: rising fabrication costs, ever more lines of code, ever greater power consumption. And he doesn't seem to notice that chip companies, even Intel itself, are beginning to inch away from the grueling financial demands of the Law and are building larger, cheaper multi-processor chips instead.

Then there is the suspicious matter of timing. Even if there is a singularity -- and that is a very big "if" -- it is hard to believe it could possibly happen in the next 20 or 30 years ... not when we can't even get the code right yet on Windows. So why does Kurzweil predict the singularity will arrive so soon? Nothing personal against Ray, whom I've known for years, but I'm in my fifties now, too, and I've heard my own 3 a.m. whispers of mortality. All that sad pill-popping may help Ray reach 2030, but not 2130. If he knew for certain that he was going to die, would he still have used the same dates in his book? And how much of this is mid-life crisis disguised as hard science?

The Human Factor

Next, there is the mechanism itself. Do we need all of these scientific disciplines to race asymptotically to infinity at the same time? What if some lag behind others? What if one or two drop out -- do we still get a singularity, or just an insane, out-of-control future? And what if Moore's Law were to slow -- as, in fact, it has done slightly over the years -- would that delay the singularity or forever defer it?

Meanwhile, assuming that everything does race ahead in concert, there is the matter of actually converting ourselves from analog to digital creatures. Kurzweil's entire theory (as first expounded in his last book) is that there is some threshold at which chip density will be so great that life -- even consciousness -- will spark in those dead silicon substrates. But so far, despite having computing power comparable to that of some primitive annelids and insects, no computer has exhibited that miracle.

At the same time, he is certain that we will soon develop the tools not only to locate every neuron in a living brain, but also to map the entire field of moving electrons as they race about the brain, storing memories, formulating new thoughts, giving each of us our unique personality. Kurzweil assumes that this will be a straightforward problem, a mere transporting of our selves from brain to digital network -- that you will be able to lift a consciousness out of a living human being and place it into a computer without catastrophic results. But I don't believe it.

Do We Really Want Immortality?

Finally, there is what might be called Kurzweil's anthropocentrism. Crucial, it seems, to his idea of the singularity is the notion that he will be part of it -- why else try so hard to stay alive until then? He even presents in the book a little playlet, in which a person talks with an understanding, helpful computer. But if this is really the next step in evolution, name for me a species that has ever survived such a transformation. You either don't evolve, and stay essentially as vermin, or you are made extinct by your successor. What makes Kurzweil think that these infinite machines will want to take us along for the ride? Even if, as in some Asimov novel, we make these machines non-threatening to humans (assuming that, once they begin building each other, they'll retain that rule), what's to keep them from being indifferent? I have a vision of the singularity arriving, and Ray Kurzweil rushing to meet it -- only to shout out "Wel…" before he is squished like a bug.

All of this leads, in the end, to the question of immortality, which seems to be Kurzweil's true intent. In my experience, smart people want to live forever, but wise people never do. Immortality, as Wallace Stevens understood 90 years ago, would be a ghastly place. Kurzweil doesn't speculate much about the world after the singularity, and probably with good reason. Life without death is life without beauty. It is a place without children, and thus without hope. It is also a place where the end would come only through misadventure or suicide -- not, as Pope John Paul II showed us so brilliantly, as the natural culmination of life. But most terrifying of all, life without death would be life without even the slightest chance of redemption. Against that, who would not choose the messiness and pain and joy of mortality?

So, does that make me not, in Kurzweil's phrase, a "singularitarian"? Probably. But it doesn't mean that I, like many people in the tech world (most notably Bill Joy), don't sense -- with considerable dread -- that something big is coming. A number of years ago, in a much-reprinted essay (http://www.forbes.com/asap/1998/1130/028.html) that used the bell at Mission Santa Clara as a symbol of the acceleration of daily life in Silicon Valley, I had my own premonition:

"...who among us will be able to cross over to the other side? A few will, perhaps our children and our children's children who have spent their entire lives as navigators of cyberspace.

But it is also not hard to imagine that no one, at least no one human, will enter this new world, or the next one that arrives in the final decades of the 21st century.

Who, or more accurately what, will this new era, this new timescape, belong to? Intuitively, we already know: the machines themselves. Eventually, anthropomorphic software agents will be our surrogates into this world...until they need us no more.

Then the tool will become the toolmaker, and perhaps the toolmaker the tool."

The difference between me and Ray Kurzweil is that I hope I don't live to see that day.

This work is the opinion of the columnist and in no way reflects the opinion of ABC News.

Michael S. Malone, once called "the Boswell of Silicon Valley," most recently was editor at large of Forbes ASAP magazine. He has covered Silicon Valley and high-tech for more than 20 years, beginning with the San Jose Mercury-News as the nation's first daily high-tech reporter. His articles and editorials have appeared in such publications as The Wall Street Journal, The Economist and Fortune, and for two years he was a columnist for The New York Times. He has hosted two national PBS shows: "Malone," a half-hour interview program that ran for nine years, and in 2001, a 16-part interview series called "Betting It All: The Entrepreneurs." Malone is best known as the author of a dozen books. His latest book, a collection of his best newspaper and magazine writings, is called "The Valley of Heart's Delight."