Kids & AI Relationships

A friend shared that her preschooler likes to ask Alexa questions. The way Alexa's speaker lights up in response to her voice is part of the appeal. So is the fact that Alexa always seems happy to answer. Alexa is never unwilling to respond to a young child's queries. Even when the voice assistant doesn't understand, it patiently waits for more information and tries again.

Children now live in a world where digital friends are an everyday reality. Some experts worry that kids may find it difficult to distinguish between 'real' and 'artificial' humans. An AI assistant's voice sounds much like a person's. It uses similar vocabulary and sentence structure. These assistants are present in homes and interact with many of the people children know. Over time, they can begin to feel like good friends or extended family.

This is where things become problematic. AI chatbots are digital tools, not human companions. They cannot understand the nuances of human communication and may miss social-emotional cues that other humans would pick up on. If children see digital and human companions as interchangeable, they may not develop all the social skills needed to support healthy relationships with others.

Some psychologists and educators suggest that one way to help kids keep a healthy distance from AI tools is to avoid using personal pronouns for AI assistants. Instead of calling Alexa 'she', use 'it'. Explain that Alexa and Google Assistant are machines, not people, even though they have human-like qualities. We use them to help us accomplish tasks or gather information.

Using 'please' and 'thank you' with AI tools can also confuse kids. Such polite language suggests they are interacting with a conversation partner who deserves social niceties, treating a machine as if it were a human being rather than a computerized voice. We don't use such language with other tools like hammers, toasters, or vacuums. Yet a recent study found that 70% of the elementary-age children interviewed didn't want to be 'rude' when interacting with Alexa. Parents and caregivers can gently correct this misapplication of social etiquette.

Children also need to know that chatbots can talk about emotions but cannot feel them. They cannot love, empathize, or experience joy. They cannot feel butterflies in their belly, be afraid of big dogs, or cry when sad things happen. This significantly limits an AI assistant's ability to relate to human experiences. It also prevents a chatbot from being a genuine friend.

To emphasize this and other distinctions, play a game where you take turns identifying different things as 'human' or 'machine'. Name a characteristic, such as "loves to watch a sunset," "is really fast at looking up information," or "gives good hugs," and then invite children to say whether they think that trait usually belongs to a human or a machine. If some actions could be either, challenge kids to explain how they would tell the difference. For example, if a robot and a person can both give good hugs, what would be different about each kind of hug? Or if a person can quickly look up information on their phone, how is that different from asking Alexa to do the research?
