
Why data equity plays a crucial role in establishing trust in the tech era

We need data equity to foster trust in the tech era


JoAnn Stonier
Mastercard Fellow, Data and Artificial Intelligence (AI), Mastercard
Lauren Woodman
CEO, DataKind
This article is part of: World Economic Forum Annual Meeting
  • The role of trust, ethics and responsible data practices is ever more important.
  • As data analytic tools, including AI and generative AI, draw conclusions and create insights from data, society needs meaningful methods to address data inequity.
  • In a world that increasingly relies on technology, trusted technical design is fundamental, and data equity is fundamental to that design.

In our fast-paced, data-driven world, the role of trust, ethics and responsible data practices has become ever more important. Yet these practices are ever more debated as organizations struggle to reconcile the pressing need to innovate with the protection of human rights; in this struggle, integrating data equity into the innovation process becomes paramount.

All people deserve and are owed universal, indivisible and inalienable human rights in our digital age. We advocate embedding data equity as a fundamental principle to protect these rights: by addressing data equity in the innovation process, we safeguard those rights and establish trust among all stakeholders.


Ways to ensure data equity

As analytic methods such as artificial intelligence and generative AI use data to draw conclusions and create new insights, we must, as a society, create robust methods to address inequity. Data equity is an imperative: it demands a series of actions to respond to the historical and current harms caused by our data, our analytic methods and our carelessness in applying outputs to populations that have not always been reflected correctly in our data sets or analyses.

Many of the resultant harms have been inadvertent; the data may have been collected for unrelated purposes or with outdated or unreliable collection methods, and may fail to reflect that our societies have grown and changed over time. Unfortunately, the data used for analytics and innovation has not always caught up. This mismatch requires attention from data professionals and social scientists to create outcomes that are fair, just and consistent for all in our society.

To ensure trust and limit harm, two sets of actions are required. First, social impacts must be understood as innovation goals are designed, so that outcomes genuinely benefit all populations. With guidance from recipient populations, an understanding of social norms and an appreciation of how different aspects of analytics will affect or ignore a specific population, an innovation project can be designed to deliver equitable outcomes for the problems it is attempting to solve.


Second, coupling that social information with a data-driven review of inputs, including the quality, consistency, completeness and representation of data sets, as well as how data analytics and algorithms are structured, will allow for better outputs from analytics, machine learning and artificial intelligence. Such a comprehensive approach will improve equity in outcomes.

We know, for example, that individuals with disabilities have historically been ignored in product design. With the right input from a given community, a product can be designed to meet the needs of those individuals. Once those product goals are understood, data professionals can ensure that adequate information is available to perform the analysis, both in the input data and its quality and in the analytic methods used. This will allow for more informative testing and more inclusive results.

Creating new frameworks

Achieving this vision requires changes from everyone involved in digital innovation. It requires new frameworks for digital design that ask the right questions upfront, to ensure a more comprehensive design for the diverse individuals and communities that will be affected by a potential product, solution or service. These frameworks will help define the types of data needed to ensure equity in the digital and technical design process. They, in turn, require methods to collect that data while honouring individual privacy and the cultural values of various communities, and providing requisite security for sensitive information.

While challenging, this endeavour is crucial: it will change the outcomes that individuals and marginalized communities experience today, mitigate the issues related to bias in machine learning and artificial intelligence, and make all of us more mindful in the design process.

By incorporating data equity into all design efforts, we enhance community trust in technology and ensure that diverse voices are represented as our digital landscape evolves. Communities and individuals will be able to see that they are represented and understood, and will therefore trust the technical design process, embedding trust in the application of technologies within society.

Without proactive data equity, our world will continue to design for those who are already represented, fostering misunderstanding, distrust and division as touted innovations fail to meet the needs of many. We will squander the chance to fully realize the benefits of technological progress for all. In a world that increasingly relies on technology, trusted technical design is fundamental. We believe that trust starts with data equity.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
