Is your kid friends with Alexa? A psychologist on what that means for their development
...
MACH: Toy robots are nothing new. What is it about the new robots that concerns you?
Turkle: Toy robots of the past presented themselves as machine-like. Indeed, when I was a little girl I played with a robot that I built, a “Mr. Machine.” Its machine nature meant that it did not have emotions. The robotic toys on the market today are put out there to play on the question of their emotional status. Jibo, for example, presents itself as a friend. Its claim to fame is that it shows emotion and a capacity for relationships.
When a robot or home assistant comes with an added, explicit dose of personality — when it behaves as though it has an emotional life and a capacity for empathy — it is being deceitful. It has these capacities only in an “as if” way. It’s the “as if” promise of relationship that concerns me.
By “as if,” do you mean they pretend to offer something they can’t?
Yes. If children learn to respond to “as if” empathy, we are not preparing them for the complexity, nuance, and negotiations of true empathy and true listening. There are skills of listening, of putting oneself in the place of the other, that are required when two human beings try to deeply understand each other.
Not only can’t you practice relational skills by talking to machines, but you make negative progress. For example, a machine always has a response ready. You never have to wait, to attend to silences or to what one young woman I interviewed called the “boring bits” in conversation. We can forget the kind of listening and the kind of talking about our feelings that real conversation requires.
Stuffed toys don’t feel emotion either. How is playing with a teddy bear different from playing with a robot?
A teddy bear is an object that does not make any pretense to having its own emotion. This means that children are free to relate to it using the psychology of projection. An example I like to use is this: If a young girl has just broken her mother’s crystal, she might put her teddy bear in detention. Children are really using the object to engage with their own feelings.
A Jibo or a Cozmo constrains this kind of play because it appears to have feelings of its own. Children relate to these robots with the psychology of engagement.
To what degree do personal assistants like Siri or Alexa, which lack the physical features of a robot, raise the same issues? Children have been known to form relationships with these disembodied voices.
Personal assistants such as Siri or Alexa raise the same issues. Mattel, in trying to market a personal assistant for children [in 2017] — something called Aristotle, which never made it to market because of privacy concerns — admitted, “Children will form relationships with Aristotle. We just hope they are the right kind.”
We are confronted with ever-more sophisticated examples of artificial intimacy. And yet there are no right kinds of relationships here. Aristotle, like Jibo, like Alexa, like Siri, like Cozmo, cannot be in a “relationship” with your child. They are empathy machines that can only put children in a position of pretend empathy. And pretend empathy will never teach the real kind.
...