New computers will be sensitive to your emotions, which leads Wendy Wolfson to wonder about our relationship with robotic pets--especially when combined with the human tendency to hack.
I've never wanted to keep a dog in the city. The idea of following behind a dog with a plastic baggie repulses me. Nor do I want to encourage the fantasies of the woman who dreamed of a special pistol to mark offending dogs with a jet of red ink.
Yet, as a believer in healthy obsessions, I've seriously considered getting a pet. I've considered several options. One friend waxes ecstatic about her tidy guinea pigs, and they certainly do bring some owners to the point of obsession. But I think I've found myself a better solution: a neat little robotic friend.
Japan has a decades-old tradition of developing robots for entertainment. The home of Godzilla has produced numerous alternatives for those who seek low-maintenance relationships, such as Tamagotchis, little egg-shaped virtual pets that require constant care and nurturance lest they die, and the classic Furby. Now, potential dog lovers who don't want the hassle of cleaning up after their pet have Sony's AIBO robodog.
The AIBO, whose name is short for Artificial Intelligence roBOt (and puns on aibo, Japanese for "companion"), has seeing eyes and stereo ears. It also has programmed instincts and emotions. Sony's newest version, the ERS-210, has more touch sensors, greater freedom of movement, and the ability to recognize up to 50 simple words as it "grows." Several Memory Stick-based programs are available, including the basic "AIBO Life" software, which lets your robodog mature from infancy to adulthood. (Cheap knockoffs include Poo-Chi and Tekno the Robotic Puppy.)
Are We Ready For Robotic Relationships? Clearly, we humans have a strange idea of what is alive and what isn't. Apparently, we're able to make remarkable leaps of faith in imbuing objects with perceived personalities.
To date, robots have been technology-driven rather than human-factors driven. On the horizon are new generations of computers that will be sensitive to your emotions. In a scientific quest to learn more about ourselves--a fundamental goal of A.I.--researchers at the MIT Artificial Intelligence Lab, among others, are developing affective machines: computers imbued with rudimentary emotional intelligence, along with computer-assisted learning applications. As affective computing develops, it may become the underpinning of a new generation of robotic companions and toys.
I wanted to see it for myself, so I dropped by the MIT Artificial Intelligence Lab. Aaron Edsinger, a researcher on the Humanoid Robot Team, introduced me to Kismet and Cog.
Kismet, a "sociable robot," is programmed to demonstrate humanlike expressiveness and to interact with people the way a growing toddler does. Kismet was totally cute, with big eyes, rudimentary eyelashes, and perky Furby-like ears. On the table in front of Kismet lay a C++ book and colorful stuffed toys.
Cog, on the other hand, doesn't exude cuteness; he towered over me, slightly menacingly--and reminded me of the Terminator. Incongruously, hulking Cog also was surrounded by colorful blocks and stuffed toys. Both robots were dormant, but Cog's vision was active. The wall behind him was lined with a mass of video screens, connected to his eye cameras.
The responses evoked in us by eye contact and a few gestures are as fascinating as the inner workings of the robots themselves. Why would people spend hours trying to feed and relate to a Tamagotchi? In our heads, we know they're just machines. But it is remarkably easy to suspend disbelief and interact with a disembodied robot head.
Edsinger explained that researchers use a whole bag of tricks to get people to "entrain" emotionally with the robots. The researchers programmed the robots to react to people using criteria including shape, skin color, and differentiation from the environment.
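The cues Edsinger described--shape, skin color, and standing out from the background--can be caricatured in a few lines of code. The sketch below is purely illustrative, not Kismet's or Cog's actual software; the function names, color thresholds, and two-frame motion test are all invented for this example.

```python
import numpy as np

def skin_mask(rgb):
    """Crude skin-color heuristic: red channel dominant over green and blue.
    Real systems use calibrated color spaces; thresholds here are invented."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (r > g + 15) & (r > b + 15)

def motion_mask(prev_gray, curr_gray, thresh=25):
    """Frame differencing: pixels that changed noticeably between frames."""
    return np.abs(curr_gray.astype(int) - prev_gray.astype(int)) > thresh

def attention_target(prev_rgb, curr_rgb):
    """Combine skin-color and motion cues; return the (x, y) centroid of
    the most 'interesting' pixels, or None if nothing stands out."""
    gray = lambda im: im.mean(axis=2)
    salient = skin_mask(curr_rgb) & motion_mask(gray(prev_rgb), gray(curr_rgb))
    ys, xs = np.nonzero(salient)
    if len(xs) == 0:
        return None
    return (int(xs.mean()), int(ys.mean()))
```

Feeding it two frames in which a skin-toned patch appears yields the patch's center--the point where a robot like Cog might aim its "red box" of attention.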
Kismet certainly has the cues to make a human melt: large infant eyes, a round face, pointy ears, and a high-pitched squeaky voice. The elements are primitive, the long eyelashes stuck on haphazardly. But the robot's expressiveness works a powerful magic. Kismet is designed to learn from the inflection patterns of human voices, just as an infant picks up general intonations.
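Reading intonation rather than words comes down to tracking pitch over time. Here is a minimal, hypothetical sketch of that idea--an autocorrelation pitch tracker plus a rising-versus-falling contour test. It is not Kismet's actual prosody code; the frame size, pitch range, and the two-way "rising/falling" classification are simplifications chosen for illustration.

```python
import numpy as np

def pitch_track(signal, sr=8000, frame=400, fmin=80, fmax=400):
    """Estimate a rough pitch (Hz) for each 50-ms frame via autocorrelation."""
    lo, hi = int(sr / fmax), int(sr / fmin)  # lag range for fmin..fmax
    pitches = []
    for start in range(0, len(signal) - frame, frame):
        x = signal[start:start + frame]
        x = x - x.mean()
        ac = np.correlate(x, x, mode="full")[frame - 1:]  # lags 0..frame-1
        lag = lo + int(np.argmax(ac[lo:hi]))  # strongest periodicity
        pitches.append(sr / lag)
    return pitches

def intonation(signal, sr=8000):
    """Classify the pitch contour: rising (roughly, approval or a question)
    versus falling (roughly, a prohibition). A crude caricature of prosody."""
    p = pitch_track(signal, sr)
    half = len(p) // 2
    return "rising" if np.mean(p[half:]) > np.mean(p[:half]) else "falling"
```

On a synthesized sweep from 120 Hz up to 300 Hz, `intonation` reports "rising"; reverse the sweep and it reports "falling"--the kind of coarse cue an infant, or a Kismet, could plausibly latch onto.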
I watched the researchers design the next-generation Kismet head. While its metal skull will have skin, its features are merely indicated.
When In Doubt, Go Cute And Furry
When designing anthropomorphic playmates, the trick is not to make them too real. We seem to judge things that are supposed to look like us by harsher perceptual criteria. Roboticists have learned that, to be better accepted, interactive machines need to be cute, or they must have clearly representational, non-human features.
Perhaps it's because, in trying to make a thing appear fully human, nobody gets the proportions quite right. The effect is subtly creepy--like Fumio Hara's slightly cross-eyed decapitated-head computer interface. My Real Baby, which iRobot billed as the most technologically advanced baby doll around, had a great machine-intelligence underpinning, but some feel its baby face suffers from the creepiness factor as well.
Interactive computers and robots, and our responses and reactions to them, also become reflections of our cultural and gender differences. A robotics engineer remarked that the developers and users of gaming software have tended to be boys. Cog represents the classic male-built 'bot, but Kismet was built by a woman, and their forms and functions reflect it, says the engineer. The next generation of interactive computers and robots, designed and used by more girls, may combine qualities that women value in emotional and interpersonal connection.
"I certainly would like to see a more even gender balance in the field so that future robots can benefit from the issues/perspectives that both genders have to offer while avoiding needless gender stereotypes," Cynthia Breazeal, Kismet's designer, told me. "It's important to think about these issues from the human side as well as the technology side."
One disadvantage of interactive computers is that they could become unintentionally annoying. I already have a relationship with my computer. I spend more hours a day sitting in front of it than conversing with friends. I don't want any back talk. If my interactive computer has a bad hair day, instead of being simply capricious, will I have to bring it to the repairman for an attitude adjustment?
Aside from the human desire to nurture just about anything mechanical that's cute, or squeaks, never underestimate the impulse to hack.
Just as we humans need to nurture guinea pigs, we need to hack Furby innards. We need to tinker, improve, and customize. Whether by breeding poodles, customizing code, or adding hydraulics and a whomping stereo system to our low riders, our unruly human creativity could conceivably turn out some very interesting--and unauthorized--computers.
It's already begun. I heard about a game site that was hijacked when participants started breeding the virtual characters for all sorts of pathologies. And, to Sony's consternation, intrepid AIBO owners have taught their robotic dogs some new tricks. Sony shut down the aibohack site by threatening a lawsuit, but you can still find bootleg movies of AIBOs disco dancing.
Reach Out And Touch Someone
Humans have been hacking the genetic code of domesticated animals for millennia. Your pet dog has been bred to interact with you. All that tail-wagging may only be eliciting your emotional cues so you will feed him and meet his needs. But hopefully your dog can distinguish you from the mailman.
When is sort-of-alive not enough? Does extensive play with computer toys encourage a kind of solipsism, as you project your emotions onto an object programmed to elicit your responses?
As I passed in front of Cog's camera eyes at the MIT AI Lab, I could see myself as the robot saw me on his video screens, targeted by a red box. The robot's machine vision system differentiated me from the background by skin color, movement, and shape.
But Cog doesn't recognize that I'm human. The robots don't remember the cumulative effect of their socialization. While both Cog and Kismet have different ways of distinguishing people from the great blob that is their world, they have no concept of the human "other." To a rudimentary degree, Cog may be able to tell me apart from the stuffed red rabbit lying in front of him, but the familiarity is nothing personal. Your AIBO thinks you're a moving shape, but attaches no importance to you.
Edsinger told me that Cog once spontaneously reached out and touched him. He knew it was just a random movement, but he couldn't help himself. A part of him wanted to believe otherwise. Says Edsinger, "They don't know we are here."
Does the thought of a robotic pet make you feel warm and fuzzy, or does it creep you out? Are you ready to cuddle up with your own personal R2D2? Do you think we're ready for an emotional relationship to a bucket of bolts? Make your own pet sounds in Wendy Wolfson's discussion forum.