Georgia Tech
Researchers are programming robots to understand when they have gained a human's attention.
Updated 3/17/2011 6:13:52 PM ET

Instead of grabbing your attention by attempting to take over the world, robots in the future might just wave their arms at you, researchers suggest.

Scientists are now programming robots to understand when they have gained a person's attention. The hope in the long run is to help robots interact with humans the way people do with each other.

"We would like to bring robots into the human world," said researcher Aaron Bobick, a roboticist specializing in robot vision at the Georgia Institute of Technology. "That means they have to engage with human beings, and human beings have an expectation of being engaged in a way similar to the way other human beings would engage with them."

The researchers began with the robot Simon from roboticist Andrea Thomaz's lab at the Georgia Institute of Technology. The droid was designed to explore side-by-side interactions between humans and robots.

"Simon is a humanoid-torso robot, pretty close to the size of a smaller adult, except with no lower body and a disproportionately bigger head than the body might entail," Bobick said.

The investigators wanted to see whether Simon could tell when he had successfully caught the eye of a person in the middle of a task.

"People were given a couple of tasks, such as talking on a cell phone, playing with blocks, or working on a Rubik's Cube," Bobick explained.

Simon would then make a gesture of some kind, such as waving one of his hands or beckoning the person closer. "The computer vision task was to try to determine whether or not you had captured the attention of the human being," Bobick said.

Simon observed a person for several seconds before the wave and up to three seconds afterward, looking for changes in behavior. "Maybe people would wave back, or change the direction they were looking," Bobick said. "Simon would look at what their motions across their bodies were like, and if there were any deviations in the patterns of behavior they showed before and after the signal that Simon gave."
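The before-and-after comparison Bobick describes can be sketched in a few lines of code. The actual Georgia Tech vision system is not detailed in the article, so the function, thresholds, and motion values below are purely illustrative: the idea is simply to flag attention when a person's motion after the gesture deviates strongly from the baseline observed before it.

```python
# Hypothetical sketch of before/after behavioral-change detection,
# assuming per-frame motion magnitudes have already been extracted
# from camera frames. Names and the threshold are illustrative, not
# the researchers' actual implementation.

def attention_captured(before, after, threshold=2.0):
    """Return True if motion after the gesture deviates from the
    pre-gesture baseline by more than `threshold` baseline standard
    deviations."""
    mean_b = sum(before) / len(before)
    var_b = sum((x - mean_b) ** 2 for x in before) / len(before)
    std_b = max(var_b ** 0.5, 1e-6)  # avoid division by zero
    mean_a = sum(after) / len(after)
    return abs(mean_a - mean_b) / std_b > threshold

# Steady motion before the wave, then a sudden wave back afterward:
baseline = [0.9, 1.0, 1.1, 1.0, 0.9, 1.1]  # pre-gesture motion per frame
reaction = [3.0, 3.4, 2.8]                 # large deviation: attention
print(attention_captured(baseline, reaction))          # True
print(attention_captured(baseline, [1.0, 0.9, 1.1]))   # False
```

A real system would compute these motion magnitudes from optical flow or body tracking and would likely combine several cues (gaze direction, gestures back), but the core test, comparing behavior statistics across the moment of the signal, follows this shape.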

With close to 80 percent accuracy, using only his cameras as a guide, Simon was able to tell whether someone was paying attention to him or ignoring him.

"Whether its purpose is to assist the elderly or to help a human with manufacturing and assembly, natural interactions between robots and human beings are necessary for the collaboration to be fluid," Bobick told TechNewsDaily. "This ability to understand human reactions is part of our vision."

Bobick and his colleagues are improving Simon's ability to get and understand attention by incorporating other details the robot can look for, such as a change in the direction a person gazes. "Fundamentally, human behavior is quite complex and often subtle," Bobick said. "This makes the perception challenging."

Future work could also explore what Simon might do if he fails to get a person's attention on the first try; for instance, he might wave again.

One difficulty with the work is that Simon's motions are not yet fully human-like.

"For example, Simon moves more slowly than one would expect a human to move," Bobick said. "This sometimes causes humans to also move slowly or in a more stilted fashion, which of course sometimes reduces our ability to detect the behavioral change. As Simon moves more naturally, we can expect people to react more naturally, which will likely make the perception easier."

Bobick and his colleagues detailed their findings March 8 at the Human-Robot Interaction conference in Lausanne, Switzerland.

© 2012 TechNewsDaily
