
Why AI companies should develop child-friendly toys and how to incentivize them

AI-enabled smart toys can exhibit highly positive effects on children’s development when designed responsibly.

Karolina La Fors
Generation AI Fellow, World Economic Forum and DesignLab University of Twente
Benjamin Larsen
Lead, Artificial Intelligence and Machine Learning, World Economic Forum

  • Smart toys are proliferating rapidly across the globe, presenting new risks and opportunities for the youngest generation.
  • Several emerging regulations set minimum requirements for smart toy manufacturers in areas such as cybersecurity and online safety.
  • Where regulations fall short, initiatives such as the Smart Toy Awards actively incentivize smart toy developers to make children's wellbeing and developmental needs their top design priority.

AI-enabled smart toys come in a rapidly growing diversity and inhabit children's most familiar social environments. The Market Research Future group, for instance, projects that the global market for such toys will grow by 26% to reach USD 107.02 billion by 2030. Through their growing interaction with children, their transferability across contexts, their connectivity to other AI-enabled devices, and the ways in which children unconsciously become entangled with them, AI-enabled toys regularly impact the upbringing of the youngest generation.

Exposure to online safety risks with smart toys

Smart toys, as highlighted by the Generation AI Initiative of the World Economic Forum, can exhibit highly positive effects on children's development when designed responsibly. Complaints received by the Federal Trade Commission (FTC), an investigation by the Norwegian Consumer Council and a rich body of scholarly research, however, also point to the severe negative impacts that smart toys can have on children's development.

The non-transparent ways in which some smart toys exchange data with other AI-enabled devices for algorithmic analysis, e.g. over Bluetooth connections, demonstrate how weak cybersecurity features can violate children's privacy and jeopardize their safety. Smart toys can exacerbate children's exposure to online safety risks, such as "'content' risks (e.g. exposure to harmful or age-inappropriate material); 'contact' risks (e.g. exposure to unsolicited contact from adults); 'conduct' risks (e.g. cyberbullying); and 'contract' risks (e.g. data harvesting, commercial pressure and exhortations to gamble)", as defined by the 5Rights Foundation.

Legal mechanisms internationally govern the proliferation of AI-powered toys and the mitigation of such risks. The EU's Cybersecurity Act sets minimum requirements for smart toy developers, who must build strong cybersecurity features into any toys marketed in the EU. The European AI Act, which is expected to go into effect in 2024, introduces a four-tiered risk framework for evaluating artificial intelligence technologies and also requires smart toy innovators to assess the impact of their AI systems against "reasonably foreseeable misuse". AI-enabled smart toys that exploit children are outright banned by the regulation. The EU's Digital Services Act also protects youth by banning AI-enabled targeted advertising based on the profiling of children. In the United States, the Children's Online Privacy Protection Act (COPPA), a federal law, requires websites to set age-appropriate criteria for access to and exchange of content. The FTC has also recently introduced a cybersecurity labeling programme to protect consumers from breaches of AI systems.

Protecting the interest of children and youth

Beyond laws, standards and policy frameworks also have a positive influence on smart toy developers. The IEEE Standard for an Age-Appropriate Digital Services Framework offers a robust international benchmark for smart toy companies to comply with. UNICEF's Policy Guidance on AI for Children offers recommendations for building AI policies and systems that uphold child rights. The World Economic Forum's AI for Children Toolkit is another means of encouraging companies to innovate responsible AI-powered toys that better protect the interests of children and youth. Yet while these laws and measures govern the diffusion of smart toys, they may fall short of incentivizing the innovation of child-development-oriented products.

UNICEF has provided several case studies of smart toys that display current best practices. Mattel has, for instance, collaborated with ToyTalk to release guidance that addresses parents' privacy concerns. Parents can access any data recorded to ToyTalk's cloud account and delete any of that information. ToyTalk also participates in the FTC-approved kidSAFE Seal Program, a compliance program for online services that target children.

Diverse AI-enabled toy innovations have by now also demonstrated how AI systems can positively impact children's development. NAO, an educational robot, is one example of a smart toy that teaches children by enabling them to spot and correct mistakes as part of their interactions and learning. PlayShifu, a startup founded in India in 2016, has grown into a global company offering diverse, interactive smart toys that align with educational research, child development and parental needs. ROYBI Robot is a smart companion that introduces kids to technology, math, science and languages such as English, Mandarin and Spanish.

The Smart Toy Awards

The Smart Toy Awards, co-hosted in 2022 by the World Economic Forum and the Dubai Future Foundation, exemplify an active means of incentivizing smart toy developers across the globe to prioritize children's developmental needs when designing AI-enabled toys. The awards offer winners a unique competitive advantage by showcasing how smart toy manufacturers can go beyond legal compliance and offer children and parents safe, nurturing and playful experiences.

Beyond relevant regulations, the judging criteria for the Smart Toy Awards also rest upon the FIRST principles of the AI for Children Toolkit. The toolkit encourages companies to demonstrate how their toys display fair, inclusive, responsible, safe and transparent characteristics, and it offers actionable guidance for companies, parents and leaders to better grasp the benefits and risks of AI-enabled toys.

By requiring children's rights impact assessments and adherence to the FIRST principles as conditions of entry, the Smart Toy Awards demonstrate how a competition can spur diverse child-friendly smart toy innovations. Winners receive an honorary badge that can be used on their products, which may help parents better understand what it takes for a toy to display responsible AI behavior in interaction with children.

Innovation needs to respond to developmental needs and children's wellbeing

As the influence of AI-enabled smart toys on the youngest generation continues to broaden, it is imperative that smart toy innovation is driven by children's developmental needs. Companies that voluntarily commit to child-centric governance models will stand out as champions not only among AI innovators but, most of all, among members of the youngest generation and their parents.
