Will Our Love Of 'Imperfect' Robots Harm Us?

Flawed robots make people more comfortable in certain settings, which is fine. But what happens when we need robots to be perfect?

David Wagner, Executive Editor, Community & IT Life

October 16, 2015

(Image: CBS Television via Wikipedia)


We are drawn to robots that have the same kind of cognitive biases and flaws that we do, according to a report from researchers at the University of Lincoln in the UK. Because of this, we may need to consider making robots less perfect in order to build positive, long-term relationships between humans and robots.

This is an especially important finding considering that Gartner recently predicted that, by the end of 2018, 3 million people worldwide will have a robot for a boss. If we will soon be interacting with robots at work, with some of them even ordering us around, is it a good idea to make them less perfect to keep us comfortable?

The University of Lincoln researchers, who presented their findings at the International Conference on Intelligent Robots and Systems (IROS) in Hamburg earlier this month, didn't tackle that specific question. Instead, they focused on robots used in education for children on the autism spectrum and those that support caregivers for the elderly.

The researchers introduced the cognitive biases of forgetfulness and the "empathy gap" into two different robots: ERWIN (Emotional Robot with Intelligent Network), which can express five basic emotions, and Keepon, a small yellow toy robot that has been used to study child social development. In both cases, half the interactions with these robots included the cognitive biases and half did not.

Overwhelmingly, human subjects said they enjoyed a more meaningful interaction with the robots when the machines made mistakes.

"The cognitive biases we introduced led to a more humanlike interaction process," Mriganka Biswas, the lead researcher explained in a press release. "We monitored how the participants responded to the robots and overwhelmingly found that they paid attention for longer and actually enjoyed the fact that a robot could make common mistakes, forget facts and express more extreme emotions, just as humans can."

He went on to say something a little more controversial, in my mind: "As long as a robot can show imperfections which are similar to those of humans during their interactions, we are confident that long-term human-robot relations can be developed."

Granted, this study was on children and the elderly. The needs of these groups are clearly different from those of people in an office setting. At the same time, the notion that humans enjoy seeing flaws and biases in robots because it makes them seem more like us is worrisome.

Some humans have a bias toward racism. No doubt a racist robot would be pleasing to those people. Sure, that's an extreme example. But cognitive biases take all forms, and in a business setting we try to train ourselves out of as many as possible. For instance, many of us rely on heuristic shortcuts (gut feelings) that lead to fast, but not always accurate, decisions. Do we want robots that shoot from the hip (or look like they do)? Aren't we trying to run data-driven businesses?

For most people, exposure to robots has been limited to science fiction. We're willing to accept the android Lieutenant Commander Data from Star Trek because he has no emotions. We're OK with him remembering everything and being faster and stronger because he lacks something essentially human. We can handle C-3PO from the Star Wars franchise because he's a coward and a bumbling fool, even though he's fluent in more than 6 million forms of communication and can calculate probabilities faster than humans can. These flaws allow us to accept our weaknesses in front of machines that are potentially superior to us.

[What's wrong with robot masters anyway? Read 10 Reasons Why Robots Should Rule the World.]

What happens in a business setting? Do we keep the flaws in robots to make people happy or do we learn to accept our own inadequacies in the name of better business? We're not there yet. Robots aren't superior to humans.

But if Gartner is right, it won't be long until a robot gives you an order. Will you trust the order? Will you take its judgment over your own? Will it need to pretend to forget things just so you can accept its orders? Long before we have to worry about robots being our new masters, we need to think about how we will work together, side-by-side with companion robots. Daryl Plummer, a Gartner vice president and Fellow, said, "In the next few years, relationships between people and machines will go from cooperative, to co-dependent, to competitive."

If we can't handle being cooperative without having to dumb down robots, how are we going to handle being competitive with them?

About the Author

David Wagner

Executive Editor, Community & IT Life

David has been writing on business and technology for over 10 years and was most recently Managing Editor at Enterpriseefficiency.com. Before that he was an Assistant Editor at MIT Sloan Management Review, where he covered a wide range of business topics including IT, leadership, and innovation. He has also been a freelance writer for many top consulting firms and academics in the business and technology sectors. Born in Silver Spring, Md., he grew up doodling on the back of used punch cards from the data center his father ran for over 25 years. In his spare time, he loses golf balls (and occasionally puts one in a hole), posts too often on Facebook, and teaches his two kids to take the zombie apocalypse just a little too seriously. 

