Monday, February 21, 2011

Twittering Tunisia, Cloning Cairo



I don't think the world has ever seen anything quite like this: whole populations massing and simultaneously rising up against their governments. It seems like every day there's a new "Facebook" revolution somewhere, with chain reactions that resemble nothing so much as the flocking behavior of birds or the schooling behavior of fish. First Tunisia, then Egypt... Bahrain... Iran... Yemen... Sudan... Jordan... Morocco... and now Libya (where, it seems, the regime will do anything to stay in power, including shooting people in cold blood with heavy-caliber weapons). All of it happening at lightning speed--faster than human brains can comfortably process. Nobody can say for sure who is in charge of the program.

Who would have thought that revolutions could self-replicate like computer viruses, leaping from mind to mind and country to country, propagating out of control? Everything now is so hyped up that it's as if the whole world is in an altered state, while long-standing systems suddenly diverge into unpredictable thrashing or simply collapse because they can no longer support themselves.

Is this what the first stage of the "Singularity" looks like, I wonder? The Singularity, by definition, is that moment "when technological change becomes so rapid and profound it represents a rupture in the fabric of human history."

I am quoting here from an article by Lev Grossman in the Feb. 21 issue of Time magazine, describing the potential transformation of our species into something no longer recognizable, which supposedly will happen when computers become much more intelligent than humans. (The IBM supercomputer "Watson" already proved its superiority by winning the million-dollar prize on the quiz show "Jeopardy!" last week.) At some future point, according to Ray Kurzweil, a primary guru of the superhuman "intelligence explosion," organic intelligence will have merged with artificial intelligence in ways that usurp what was once the exclusive realm of human creativity. Kurzweil predicts the culmination of civilization as we know it by 2045, some 35 years from now.

So how prepared are you to mutate? (I'm not.) The key idea here seems to be that human brains are hardwired for linear progress, whereas technology progresses exponentially, which means that as computers increase in speed and power, they will take over their own development and become ever more autonomous. By contrast, humans will become ever more dependent, unable to function without their symbiotic connection to technology. Not a good equation, when you really think about it.
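To make the arithmetic behind that equation concrete, here is a minimal sketch in Python of how quickly a doubling process outruns a steadily improving one. The starting value of 100, the yearly increment of 10, and the two-year doubling period are my own illustrative assumptions (a Moore's-law-style rate), not figures from Kurzweil or the Time article.

# Illustrative comparison of linear vs. exponential growth in "capability."
# All numbers below are assumptions chosen for illustration, not real estimates.

def linear_growth(start, increment, years):
    """A capability that improves by a fixed amount each year."""
    return start + increment * years

def exponential_growth(start, doubling_period, years):
    """A capability that doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for years in (5, 10, 20, 34):  # 34 years is roughly 2011 to 2045
    lin = linear_growth(100, 10, years)      # steady, brain-friendly progress
    exp = exponential_growth(100, 2, years)  # doubling every two years
    print(f"{years:>2} yrs   linear: {lin:>8.0f}   exponential: {exp:>12.0f}")

After 34 years the linear curve has barely quadrupled, while the doubling curve has grown by a factor of more than 130,000; that gap is the whole argument in miniature.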

Eventually, says Kurzweil, we will do really delicious things like scan our consciousnesses into computers and live inside them as software. Things that were once fantasized as science fiction (like the possibility that computers may turn on society and annihilate us) will become absolutely real.

By 2045, Kurzweil estimates, the quantity of artificial intelligence created will be about a billion times the sum of all human intelligence that exists today--to the point where Singularitarians "cannot believe you're walking around living your life and watching TV as if the artificial intelligence revolution were not about to erupt and change absolutely everything."

Meanwhile, in case you're worrying about those Facebook and Twitter revolutions convulsing the Middle East, wondering how it will all end, you might want to consider a stint at the three-year-old Singularity University, co-founded by Kurzweil and sponsored by NASA and Google, where interdisciplinary courses for graduate students and executives teach you to evaluate the power and speed of computers and to track the pace of technological progress.

To grasp the way technology affects our perception of reality and even how we think, it helps to recall how Joseph Weizenbaum (in his 1976 book "Computer Power and Human Reason") described technologies from the past, like the map and the clock, as having become part of "the very stuff out of which man builds his world." Once adopted, he argues, technologies tend to become so indispensably integrated with, and mirrored by, neural structures that they cannot be abandoned without fatally impairing the whole system and plunging society into "great confusion and possibly utter chaos."

Blurring the boundary between brains and machines is an "irreversible commitment." Do you know what that means? Once the double doors screech open and cybernetic blurring renders human brains and computers inseparable, we will no longer know for sure who (or what) is doing the programming. By then, however, there will be no going back. The simply and straightforwardly human will have been lost in the confusion. I can only imagine that some people find the prospect of Kurzweil's 2045 a lot more palatable than I do. Luckily, I will be under the sod by then.
