
Is your kid friends with Alexa? A psychologist on what that means for their development

Why one of the newest toy robots may not be a good birthday gift.
Sherry Turkle is a psychologist at MIT and author of “Reclaiming Conversation: The Power of Talk in a Digital Age.” Rob Kim / Getty Images file

Robots long ago took up residence in factories and warehouses, and now they’re moving into bars and coffee shops — and our homes. The newest of these household robots, like Jibo, Kuri and Cozmo, come with features other appliances lack: charming personalities.

Cute little bots that recognize faces and voices and respond in endearing ways might seem innocuous enough. But Sherry Turkle is concerned — especially since they are intended to become part of children’s lives.

A psychologist at MIT and the author of “Reclaiming Conversation: The Power of Talk in a Digital Age,” Turkle worries that while people of all ages can get taken in by the illusion of caring machines, kids are particularly vulnerable. These artificial relationships, she fears, teach lessons that can interfere with children learning to truly understand and connect with other people.

Recently, Turkle discussed why she thinks robots threaten the development of empathy, and what parents can do, with NBC News MACH’s Wynne Parry. The interview, conducted by phone and email, has been edited for clarity and brevity.

MACH: Toy robots are nothing new. What is it about the new robots that concerns you?

Turkle: Toy robots of the past presented themselves as machine-like. Indeed, when I was a little girl I played with a robot that I built, a “Mr. Machine.” Its machine nature meant that it did not have emotions. The robotic toys on the market today are put out there to play on the question of their emotional status. Jibo, for example, presents itself as a friend. Its claim to fame is that it shows emotion and a capacity for relationships.

When a robot or home assistant comes with an added, explicit dose of personality, when it behaves as though it has an emotional life and a capacity for empathy, it is being deceitful. It has these capacities only in an “as if” way. It’s the “as if” promise of relationship that concerns me.

By “as if,” do you mean they pretend to offer something they can’t?

Yes. If children learn to respond to “as if” empathy, we are not preparing them for the complexity, nuance, negotiations of true empathy, true listening. There are skills of listening, of putting oneself in the place of the other, that are required when two human beings try to deeply understand each other.

Not only can’t you practice relational skills by talking to machines, but you make negative progress. For example, a machine always has a response ready. You never have to wait, to attend to silences or to what one young woman I interviewed called the “boring bits” in conversation. We can forget the kind of listening and the kind of talking about our feelings that real conversation requires.

Children play with their pre-ordered Sony robot dogs dubbed "Aibo" after its birthday ceremony in Tokyo. Kazuhiro Nogi / AFP/Getty Images

Stuffed toys don’t feel emotion either. How is playing with a teddy bear different from playing with a robot?

A teddy bear is an object that does not make any pretense to having its own emotion. This means that children are free to relate to it using the psychology of projection. An example I like to use is this: If a young girl has just broken her mother’s crystal, she might put her teddy bear in detention. Children are really using the object to engage with their own feelings.

A Jibo or Cozmo constrains this kind of play; it comes with feelings of its own. Children relate to it with the psychology of engagement.

To what degree do personal assistants like Siri or Alexa, which lack the physical features of a robot, raise the same issues? Children have been known to form relationships with these disembodied voices.

Personal assistants such as Siri or Alexa raise the same issues. Mattel, in trying to market a personal assistant for children [in 2017] — something called Aristotle, which never made it to market because of privacy concerns — admitted, “Children will form relationships with Aristotle. We just hope they are the right kind.”

We are confronted with ever-more sophisticated examples of artificial intimacy. And yet there are no right kinds of relationships here. Aristotle, like Jibo, like Alexa, like Siri, like Cozmo, cannot be in a “relationship” with your child. They are empathy machines that can only put children in a position of pretend empathy. And pretend empathy will never teach the real kind.

How does a child’s age factor in? Is there an age at which interacting with social robots becomes less problematic in your view?

I am trying to stress that we are all at risk.

Of course, the younger the child, the more worrisome the interaction. But in my studies of children ages 5 to 14, older children were able to engage with sociable robots in ways that I find troubling. I see a particular vulnerability in adolescents. One of the stories I tell in “Reclaiming Conversation” is of a mother and daughter. The mother, Stephanie, 40, has a 10-year-old, Tara, who tends to be a perfectionist, always the “good girl.” Tara expresses anger to Siri that she doesn’t show to her parents or friends.

Stephanie wonders if this is “perhaps a good thing, certainly a more honest conversation” than Tara is having with the adults in her life. But what Tara is having with Siri is not a conversation at all. No one is listening. My worst fear: If Tara can “be herself” only with a robot, she may grow up believing that only an object can tolerate her truth.

Jibo is a household robot that "presents itself as a friend," Turkle said. Joan Cros / Sipa USA via AP

What recommendations do you offer to parents?

Don’t model behavior in which digital assistants become confidants, even in fun. Make it part of digital literacy to stress that machines are not people, and explain why. Talk to children about what machines can’t give them.

Children understand that their teddy bear is a toy that they can make into anything. A digital doll that talks back has a mind of its own. Tell your child that you want them to play with toys they can make into anything in their imagination.

And then, create lots of quiet time where your child can talk to you. In little bits. By raising a topic one day and continuing it the next. People keep telling me that robots are wonderful with children because the robots have patience. You can have patience as well! Remember, robot patience is a faux virtue: it isn’t patience at all, because it isn’t real attention to your child at all.
