Growing up with artificial intelligence: Implications for child development

Ying Xu, Yenda Prado, Rachel L. Severson, Silvia Lovato, Justine Cassell

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Artificial intelligence (AI) technologies have become increasingly integrated into children's daily lives, influencing learning, social interactions, and creative activities. This chapter provides an overview of key research fields examining children's learning from, interactions with, and understanding of AI. Current research indicates that AI has the potential to enhance children's development across multiple domains; however, ethical considerations need to be prioritized. When children engage in learning activities with AI, they may encounter inappropriate, inaccurate, or biased content. Additionally, children's social interactions with AI may affect their approach to interpersonal interactions. Finally, children's developing understanding of the world may make them particularly susceptible to attributing human-like properties to AI, undermining their expectations of these technologies. This chapter highlights the importance of future studies focusing on a child-centered design approach, promoting AI literacy, and addressing ethical concerns to fully harness AI's potential in child development. Recommendations for parents, technology developers, and policymakers are also provided.

Original language: English
Title of host publication: Handbook of Children and Screens
Subtitle of host publication: Digital Media, Development, and Well-Being from Birth Through Adolescence
Publisher: Springer Nature
Pages: 611-617
Number of pages: 7
ISBN (Electronic): 9783031693625
ISBN (Print): 9783031693618
State: Published - Dec 5 2024

Keywords

  • Artificial intelligence
  • Child development
  • Communication
  • Learning
  • Policy
