<b>Wendy Wolfson</b> is worried about the epic race between humans and machines. As new cybertechnology chips and biosensors are developed, blurring the line between in vitro and in silico, will people be able to keep up?

InformationWeek Staff, Contributor

January 23, 2002

Stephen Hawking, the noted physicist (and reputed gangsta rapper), recently fretted that, unless we start to genetically engineer ourselves, humans will not keep up with the evolving pace of computer intelligence. In an interview with a German magazine about the ethics of genetics, Hawking advocated increased use of cybertechnology to link human brains with computers. Otherwise, he warned, humans run the risk of being left behind as intelligent machines take over the world.

Linking our bodies to machines isn't new. For example, millions of Americans have pacemakers. Hawking himself depends on a machine to speak, as he suffers from Lou Gehrig's disease, a degenerative disorder of the nervous system. However, chips and biosensors in development are beginning to blur the line between in vitro and in silico. Implantable living chips may enable the blind to see, cochlear implants can restore hearing to the deaf, and implants might ameliorate the effects of Parkinson's disease or spinal damage. Thought-operated devices that let the paralyzed manipulate computer cursors are being tested.

Plenty of good may be accomplished with these inventions, but I worry. Massively parallel biocomputers may consist of little more than a puddle of cells in a bioreactor. What will happen when your biocomputer gets the flu? "Computer virus" will earn a whole new, literal meaning. (I don't even want to think about the phrase "the blue screen of death.") The potential downside to biocomputing in the year 2030 may be eerily reminiscent of what often happens to lunches stored in today's office fridge. If the power regulating the temperature in the bioreactor gets cut off, or wild viruses infect the biofilm coating your motherboard, or the office cleaning crew gets a little too enthusiastic splashing the bleach around, your IT personnel will have to don rubber gloves and hold their noses.

Chips Ahoy
Myriad workhorse biochips are being refined for drug development, delivery, and testing: neurochip sensors from a joint venture between NeuralStem Inc. and MetriGenix Inc.; a human liver cell chip for toxin testing; and the implantable "pharmacy on a chip" developed in John Santini's MIT lab, riddled with tiny wells of chemicals that automatically deliver drugs directly into patients.

Much more fun are the mobile-phone implants espoused by the cheeky folks at Halfbakery.com, a veritable fountainhead of original (though regrettably fictitious) product ideas. The Shop by Thought device seems to be the ultimate E-commerce and data-mining killer app, although it would be especially lethal to those with little impulse control.

On the biological computing front, help may come from the humble pond snail. Researchers at the Max Planck Institute for Biochemistry in Munich, Germany, have completed a "proof-of-principle experiment" by developing the world's first snail chip. They directly interfaced a neural network with a silicon semiconductor by successfully growing snail synapses over the chip. When stimulated, the interlaced snail nerve cells transmitted electric pulses to trip a switch and make the circuit go live. Their plan is to build a system with 15,000 neuron-transistor sites--a first step toward an eventual computational model of brain activity.

"Why snail brains?" you may wonder. (I sure did.) Neural cells from snails and lobsters are relatively easy to see and manipulate because they're considerably larger than human or rat neurons. Yet being a snail brain cell wrangler requires solving a number of tissue-engineering problems. Not only must the materials be neutral to biological processes, but the neurons also move around as they grow connections. So the researchers caged them in a tiny pen of polymers. Getting the little buggers to stick to your substrate, feeding them, and keeping them at the right temperature is, apparently, quite an accomplishment.

Horror Vacui
Nobody seems to intuitively have a problem with implantable devices for the blind, deaf, and impaired. However, biochips may become a (literal) invasion of privacy.

The Applied Digital Solutions "Guardian Angel" chip is implanted in thousands of household pets. Recently, however, a surgeon affiliated with the company implanted a chip in his arm and his hip to demonstrate how people with pacemakers could be scanned from up to 4 feet away.

Tracking stray cats was a promising beginning in the implantable-chip business. But, dismayed by the potential flak from civil libertarians, Applied Digital Solutions backed off from suggesting that its chips be implanted in small children and elders with dementia; the company is now marketing them (the chips, not the small children) as attachable devices.

Chips for pets haven't raised any hackles. But the idea of injecting chips into humans disturbs anyone concerned about the shreds of privacy we still hold. Implantable chips are the penultimate identifier, next to DNA, which is what makes them scary. The technology isn't there yet, but it will be. Future proposals to use chips to track prisoners, implantable devices in the military to enhance the abilities of soldiers, and cyberimplants allowing information workers to communicate with machines will make current concerns about digital privacy and medical information seem trifling. The potential for totalitarian mind control may be far-fetched, but future biobrain implants could be like today's digital cable--all those channels, but nothing on.

In a tracking sensor application that could only have been developed by the generation raised on lemon-scented Pledge commercials, the U.S. Defense Advanced Research Projects Agency is developing its own version of pixie dust for military applications: a tiny cloud of dust-mote-sized sensors labeled Smart Dust. Now you know: The dust bunnies proliferating under your bed are not just fallout from your slovenly housekeeping; they're really a top-secret military exercise. The thought of your household dirt squealing on your private doings may spur you to get out the vacuum cleaner right this very minute.

Size Matters
Another aspect of the growing bioconnections between human and machine is the effect on our evolutionary process. Where will this take us? Will it change what it means to be human?

One option is to make humans smarter. Evolutionary biologist Timothy Mousseau, at the University of South Carolina, is taking the long view. Mousseau has studied bugs, newts, and plants, as well as the mutational effects of Chernobyl's radiation on swallows.

"There is a growing body of evidence that human brain size, statistically speaking, is responsive to sexual selection," Mousseau said. "Over time, the evolution of large brains in humans has been a direct product of female sexual preferences. Given that sex (and reproduction) is an ongoing part of our basic biology that cannot be masked by modern medical advances, it is likely that simple Darwinian evolutionary processes will continue to effect changes in brain size in the future."

Mousseau observed that women tend to choose men with larger brains and greater intellectual capacity, a process that plays out over long stretches of evolutionary time. In other words, we still have the potential to get smarter as a species, if only women would stay focused on finding guys with bigger brains rather than using other attributes as mating criteria.

The question of what it means to be human as biological technologies and computers evolve underpins the work of Oron Catts and Ionat Zurr in SymbioticA, a collective of artists at the University of Western Australia. SymbioticA is the first group to use tissue-engineering technologies to make art. Its Fish & Chips project (initially codenamed "gefilte fish") gives "designer sushi" a whole new meaning. The SymbioticA artists grew neurons from the spinal cord of a goldfish over silicon chips with data embedded in them, then hooked the assembly up to a robotically actuated arm that draws pictures.

Is Fish & Chips just a weird art project? No way. It's a rumination on the much stranger and very real ways humans are converging with computers. The SymbioticA artists ask, "How are we going to interact with such cybernetic entities considering that their emergent behavior may be creative and unpredictable?" and "How will society treat notions of artistry and creativity produced by semiliving entities?"

Does It Wiggle?
Now that you know what condition your condition is in, you know that such devices are, at best, a stopgap measure in the evolutionary story. The implants you get may enhance your capabilities, but they will expire when you do, leaving the next generation unchanged.

As we become more dependent on biotechnology, the standards of what is "alive" will be up for grabs. Take a look at The Tissue Culture and Art Project's semiliving worry dolls, cultured in a bioreactor by growing living cells on artificial scaffolds, or the Pig Wings project, which explores whether pigs could fly.

Deciding who, or what, exactly, is human will be an incendiary issue in the years to come as our genetic engineering technologies progress and we go beyond implantables to actual germ-line genetic modification. We are already creating chimerical creatures by combining genes from different species. We will try to engineer improved human beings--not because we're so concerned about the intelligent machine life we are creating, but because we're human, and it's embedded in our nature to explore, tinker, and create.

Perhaps MC Hawking does indeed have the last word.

Wendy Wolfson and her spying dust bunnies live near Boston. You can share your bioengineered fears with her in Wolfson's discussion forum.
