My eldest child taught herself to code with books she checked out of the library. She learned five different computer languages the summer between fourth and fifth grade. Now she works for a major software company, exploring the possibilities and challenges of artificial intelligence (AI). She hopes to shape privacy policies and practices that maximize the good and regulate potentially harmful activities.
Digital culture and AI are significant aspects of children’s lives. They affect kids directly through gaming, educational technologies, shopping, social media, smart toys, digital assistants, and other interactive tools. They also affect children indirectly because of how social policies and practices are shaped by digital algorithms. The ways that parents and caregivers engage online have implications for kids too.
There are different types of AI as well. Generative AI invites children (and adults) to create something out of the abundance of source materials available online. It encourages kids to be creative and helps them design personal projects and pull together reports for school. It’s more imaginative than predictive AI, which gathers copious amounts of data about something (e.g., weather, purchases, people’s opinions) and then predicts trends and interests.
Conversational AI – digital assistants like Alexa, Siri, Google Assistant, ChatGPT, etc. – interacts with children in human-like ways. These assistants combine predictive activities with spoken and/or written language that mimics conversation with another person. They also build on large language models (LLMs), which are trained on huge datasets of text (from everyday language to medical literature) and can summarize, transcribe, or translate material.
Despite our ambivalence, AI can be used in spiritually healthy ways, i.e., to promote imagination, positive social values, relationality, creativity, and a positive self-image. AI smart toys encourage interactive play and spark children’s imagination. Many are designed to support learning, which in turn builds that positive self-image. They can also function like loveys and imaginary friends by providing companionship and helping a child practice social skills.
A digital assistant like Alexa, Siri, or ChatGPT provides access to information that supports a child’s goals. Kids can ask for advice on how to make and keep friends, how to prepare a special meal for a family member, or how to juggle. They can also formulate queries to help them gather data for a project and expand their knowledge of current events.
However, children need reminders to verify AI-generated information. They may not realize that AI ‘hallucinates’ (makes up information) some of the time. They also may not know how to check AI’s sources. Adults can reinforce basic media literacy skills like verifying new information against known and trusted sites and human sources.
Kids may also be confused by the ways AI assistants are and are not like humans. Virtual assistants use human-like language to communicate but are less capable of reading social-emotional cues. If a child shares their feelings with a smart toy or asks Google to solve a relational problem, these AI agents may not register the child’s sadness, anger, fear, or joy when responding. This can leave a child feeling unheard, even as other cues suggest a genuine relational connection.