We're headed toward (A) a technological Eden or (B) a wasteland of ashes and goo. And please put your response in the form of a question.
I've decided on a new career: Futurist.
There’s no real downside. If your predictions come true, you’re a genius; if they don’t, you’ll probably be dead by then anyway, so your publisher can’t ask for royalties back. At least, I don’t think it can.
Except I keep stumbling over the most elemental prediction: Is the future good or bad?
That occurred to me when I read an interview with artificial-intelligence expert and noted futurist Ray Kurzweil by my colleague Sharon Gaudin (see "Kurzweil: Computers Will Enable People To Live Forever"). In it, Kurzweil talks about the imminent merger of man and machine, how robotic parts will prolong life, nanobots in the bloodstream will cure disease, and virtual worlds will substitute for the real one. He detailed that vision in his book, The Singularity Is Near: When Humans Transcend Biology (Viking Adult, 2005).
Don’t get me wrong: Kurzweil is a brilliant man. I wouldn’t go up against him on Jeopardy!
But several of the points Kurzweil makes remind me of a song called "In The Year 2525." A hymn to technological encroachment, it hit No. 1 on the Billboard chart in the summer of 1969. It starts out like this: "In the year 2525, if man is still alive, if woman can survive, they may find ..." then goes on to list humans’ increasing dependence on technology, such as robotic limbs, virtual worlds, nanobots, etc. That’s a liberal interpretation of the lyrics, but stay with me.
The irony is that the two prognosticators make many of the same predictions but draw different conclusions. While Kurzweil is an optimist, a virtual-glass-half-full guy, the writer of "2525" is more suspicious.
These developments are closer than one might think (me, I mean; not Kurzweil). Recently, the virtual world known as Second Life has attracted a great deal of attention. In Second Life, virtual avatars interact in lifelike ways, and, as Second Life’s popularity has increased, more and more real-world businesses are setting up shop there.
Unfortunately, a worm infected Second Life two weeks ago and forced its creator, Linden Lab, to shut it down for a while ("Worm Knocks Second Life Offline"). The worm was called "gray goo," and it’s a telling reference. The gray-goo theory refers to what might happen if self-replicating nanobots get out of control and reduce the world to ... well, you get the picture. Sun co-founder Bill Joy, certainly no dummy, subscribes to the gray-goo theory.
A popular, and highly regarded, television show called Battlestar Galactica posits a future in which a decimated human population fights for survival against the robots they created. A surprise best-seller this year is a book called The Road, by Cormac McCarthy (Knopf, 2006), about the journey of a father and son through an ash-filled, postapocalyptic world. Not that popular culture is any great predictor, but a lot of people seem to be sensing something ill in the wind.
On second thought, maybe I don’t want to be a futurist. I have a hard time predicting where my shoes will be in the morning, much less what life will be like 100 years from now. When my prognostications turn out all wrong, I’ll still be alive and my publisher will definitely demand royalties back. That is, if Ray Kurzweil has it right.
Otherwise, it’s all ashes and goo anyway.
Ha! Just kidding. How’s your future looking? Send your best predictions, and industry tips, to email@example.com or phone 516-562-5326.
To discuss this column with other readers, please visit John Soat's forum.
To find out more about John Soat, please visit his page.