The following is part of MIT Engineering's Ask an Engineer series.
“We can create robots that look like they have feelings,” says Rosalind Picard, a professor of media arts and sciences at MIT and founder and director of the Affective Computing research group at the MIT Media Lab. The thing is, we’re talking appearances, not reality.
Sony’s robotic dog Aibo is a good example. It looks happy when you come home and sad when you scold it. As Picard says, a person might think, “Gee, my dog has feelings.” The truth, however, is that Aibo responds to a computer program, not to you. It matches a set of inputs with an automated output. “Aibo has a robotic tail, for instance, and the tail is programmed. The owner greets it and the program runs the wagging tail movement. Or the owner says ‘bad dog!’ and the program stops the tail,” she says. “The dog does not have any conscious experience or feelings that go with being happy or sad.”
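The input-to-output matching Picard describes can be sketched as a simple lookup table. This is a toy illustration only; the event and action names below are hypothetical, not Aibo’s real software or API.

```python
# Toy stimulus-response table in the spirit of Picard's description:
# each input event is matched to one pre-programmed output.
# Event and action names are hypothetical, invented for illustration.
TAIL_ACTIONS = {
    "owner_greets": "wag_tail",
    "owner_scolds": "stop_tail",
}

def respond(event):
    """Return the programmed tail action for an input event."""
    return TAIL_ACTIONS.get(event, "idle")

print(respond("owner_greets"))  # wag_tail
print(respond("owner_scolds"))  # stop_tail
```

The point of the sketch is that nothing here feels anything: the mapping from input to output is fixed in advance, which is exactly Picard’s distinction between appearing to have emotions and having them.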
In 2008, Picard’s group made Shybot, a personal mobile robot designed to both embody and elicit reflection on shyness behaviors. It was a favorite with the public. Using face detection and proximity sensing, the robot was designed to react to human presence and familiarity, categorizing people as friends or strangers. “When a person came near it, it scurried backward unless it really got to know you,” says Picard. “Depending on your approach, it could look like a nervous, anxious robot.”
Authored by Meg Murphy. Thanks to Abhik Maji, from India, for this question. Visit the MIT School of Engineering’s Ask an Engineer site for answers to more of your questions.