4 questions parents should ask about educational tech during COVID-19
- About 1.5 billion children globally are out of school due to COVID-19.
- Many parents are turning to online education technology, but is it safe?
- Here are four questions to ask, including: does the technology prioritize privacy?
With 1.5 billion children globally out of school due to COVID-19, many parents with access to technology and the internet are increasingly turning to online education technology, smart toys, and video games to keep their kids learning at home. Kids are using Zoom for classes and video calls, YouTube for education and leisure, online EdTech tools for learning, and video games for entertainment.
Yet little governance is in place to regulate these tools and services. When children begin to use an online learning tool or smart toy at home, a parent or guardian consents to the Terms of Service, but few parents read those terms in detail, and even fewer have the time or the legal and technical knowledge required to understand such a lengthy document.
With children and youth extremely vulnerable to the risks posed by technology, ethics and governance are urgently needed during and after the COVID-19 pandemic. When deciding whether a technology is safe and educational for their children, parents and guardians should ask four questions, covering education, safety and privacy, responsible use, and inclusion and fairness.
1. Does the technology have a strong educational foundation and encourage creativity?
The technology should have a clear pedagogical foundation for teaching children and provide data to demonstrate its educational value and impact. Not every EdTech product will have a peer-reviewed study behind it, as the app Bedtime Math does, but developers should at least provide quantitative analyses of their product's impact. Parents can also look for research-based evaluations from third parties, like Common Sense Media.
Simply teaching skills is not enough: the technology should also encourage creativity and independent thought. Technology should enhance, not hinder, creativity, and students whose curricula incorporate creativity have better learning outcomes, according to a Gallup Education study. EdTech should leverage technology to promote creativity and critical thinking, not confine children's thinking to the constraints of a program or game.
2. Does the technology protect the child, prioritize privacy, and safely store the child’s data?
The technology should have clear safety policies in place to protect children from potential bullying, harassment, exploitation or other security risks. If children can communicate with other users on the platform or within a game, they are extremely vulnerable, and parents might find it challenging to track their online activity and communications.
The technology should also have a well-defined privacy policy and allow parents and guardians to opt in to data collection for children and youth - both younger and older than 13. It should exceed the requirements of the Children’s Online Privacy Protection Act (COPPA), which gives parents control over the information collected about their children. COPPA requires websites to provide their privacy and information policies to parents and to obtain verifiable parental consent before collecting personal information from children younger than 13. Common Sense Media provides privacy ratings for education apps and websites, and curates a list of resources with “solid privacy policies.”
If parents choose to allow the technology to gather data, it should have clear safeguards in place to protect that data and to anonymize it, both for internal use and so that potential hackers cannot identify children if the data is stolen. Parents should also check that the technology does not sell children’s data to third parties without parental consent - and they should weigh that consent carefully, understanding that personal data is sold widely within the private sector.
3. Is the technology designed for responsible use and to prevent addiction?
Many technologies are designed to maximize use, and many smart toys and games are inherently addictive. The technology should have limits in place to discourage overuse. Parents should encourage children to moderate their use of technology and should lead by example. Children ages 2 to 5 should have no more than one hour of screen time per day, and parents of children over 5 should set “consistent” limits on screen time, according to the American Academy of Pediatrics.
But COVID-19 makes these guidelines challenging - if not impossible - for many families. Even UNICEF is rethinking its screen time guidelines. Jenny Radesky, a professor of pediatrics at the University of Michigan and an author of the AAP guidelines, tweeted the AAP’s advice for the pandemic, which included: “Challenge your children to practice ‘tech self-control’ and turn off tech themselves.” As a mother herself, she admits that following this advice during COVID is hard: “I’m making this up as I go, too!”
4. Is the technology inclusive, fair and unbiased?
The technology should make clear that it is designed for a broad, diverse base of children. Some technologies are designed with a specific child consumer in mind, but all should promote accessibility for every potential user, regardless of language, ability, or visual, auditory, or other impairments.
If the technology uses artificial intelligence (AI) or machine learning, such as facial recognition, parents must also ensure that it is fair and unbiased. Many AI models exhibit bias against certain groups. The technology should make clear that it treats all children fairly and guards against bias or discrimination based on age, gender identity, ethnicity, or any other demographic characteristic.
Looking forward
Although some of these questions are challenging, they are worth asking. When parents ask them, they will find that not all technologies or smart toys claiming to be educational actually teach their kids or encourage creativity. They’ll also find that their children and their data are not always well protected or treated fairly.
Many excellent EdTech tools and smart toys are both responsible and educational. Organizations and platforms such as Common Sense Media, YouTube’s Learn@Home, and the UK Department for Education have curated free online resources for families and children during the pandemic.
Yet it’s imperative that parents use these media and technologies to supplement their children’s education - not to replace the virtual learning offered by their schools, nor as a crutch to stave off boredom at home.
We need companies, non-profit organizations and governments to agree on principles for technology used by children that consider education, safety and privacy, responsible use, and inclusion and fairness. Only by doing so can we begin to ensure that technology designed for children is responsible, educational and protects them from harm.
COVID-19 presents an unprecedented challenge to parents, teachers and students around the world. Technology provides enormous promise for education, but only with a rigorous focus on responsibility and ethics can we ensure its long-term impact for children and future generations is positive.