Robots can influence children, even when they are wrong, researchers reported Wednesday.
A pair of studies show that children respond strongly to robots, especially when they are small and cute. That can be good, the team at the University of Plymouth in Britain found. But it can also be troubling, because children were more likely than adults to give an incorrect answer to a puzzle if they saw robots giving wrong answers.
“It’s a bit sinister, isn’t it? Children succumb to peer pressure from robots,” said robotics professor Tony Belpaeme, who helped lead the study team.
“We were kind of expecting to find something, mainly because we have been working with children and robots for a long time.”
The team had been doing experiments with children to see if robots could act as coaches, helping kids lose weight, study math or take better care of health conditions such as diabetes.
“Every time, we saw that children are receptive to robots,” Belpaeme told NBC News.
The team decided to re-create a well-known psychology experiment first performed in the 1950s. Called the Asch paradigm, it has shown that people will give incorrect answers if they see the people around them do so.
The experiment itself is simple. People are asked to make an easy visual comparison, such as deciding which of several lines match in length. When alone, people rarely make a mistake, Belpaeme said. In the classic experiment, human accomplices come in and deliberately give incorrect answers in front of the subject, who often gives wrong answers, too, in response to the peer pressure.
The team set up a similar experiment using adults, children aged 7 to 9, and robots. “We thought, wouldn’t it be great to try that again with robots and see if adults succumb to peer pressure by robots and if children do, too,” Belpaeme said.
“The adults don’t. They can resist. But the children do.”
When children were alone, they completed the task correctly 87 percent of the time, the team reported in the journal Science Robotics.
But when robots were present and gave incorrect answers, the kids’ scores fell to 75 percent, and the wrong answers almost always matched those given by the robots.
“It’s a bit of a warning,” Belpaeme said.
“Robots are going to be this new channel,” he added. “They are going to be like social characters in your house. They could use that to convince you to make purchases that you probably don’t need or do things that you probably wouldn’t.”
The children were probably more susceptible for two reasons, Belpaeme said. “The robots were quite small and cute,” he said. They have large eyes and are programmed to respond in a way that looks positive when people enter a room.
“Children readily suspend disbelief,” Belpaeme added. “If a child is playing, a doll isn’t just a doll. It comes alive when they are playing. It is the same thing with robots. They don’t see a robot just as plastic and electronics. They see a robot as a character.”
Adults are a little more immune, Belpaeme said, although the team said adults might respond differently to larger robots that are less cute.
He doesn’t see the potential use of robots as all bad.
“You can use it for good. Imagine that a child wants to lose some weight. You could bring in a weight loss robot. It would be very convincing,” Belpaeme said.
A robot might offer nonthreatening suggestions to eat fruit instead of candy, or encourage a child to take a walk, he said. In a separate review in Science Robotics, he found that robots could be effective tutors, especially for physical tasks such as handwriting or throwing basketballs.
“But we all know the problem that could occur when there’s money to be made,” Belpaeme said.
People already invite technology into their homes with voice-controlled smartphones and devices such as Amazon’s Echo, with its Alexa character.
These digital interfaces are dry, but robots are being made in increasingly sophisticated ways to interact with people, Belpaeme said. “We need to be careful,” he said.