
4 pillars for using AI responsibly in a skill-based organization


Using AI responsibly can help uncover skills. Image: Photo by Karolina Grabowska

Jeroen Van Hautte
Co-Founder and CTO, TechWolf
Andreas De Neve
Chief Executive Officer, TechWolf
This article is part of: World Economic Forum Annual Meeting


  • As the world moves towards skill-based workforces, more companies are drawing on AI to support that shift.
  • This makes it crucial for AI to be used responsibly: without clear oversight, systemic biases will worsen in workforces and employee trust will be lost.
  • By following four key pillars for using AI responsibly, skill-based organizations can ensure a transparent approach that benefits employees and employers.

Jobs are changing too quickly for traditional workforce models to keep up. This has given rise to a new term, the 'skill-based organization.' Studies indicate that 90% of business executives are now experimenting with building a skill-based organization.

Moving from jobs to skills is complex. It involves unravelling roles to a granular, skill-based level. That takes time, and data may be missing if an HR or learning system hasn’t been updated in a while. On top of that, an organization typically has roughly twice as many skills as it has roles and jobs.

Organizations have realized that manually keeping track of dozens of continuously evolving skills across thousands of workers is simply not feasible. So they turn to AI. This is a step in the right direction and the only way we can truly embrace the potential of the skill-based organization. But we must tread carefully.

When people’s livelihoods are on the line, the AI used to influence those lives must be as open, transparent and ethical as possible. Tales of AI-gone-bad are, unfortunately, all too common. Amazon had to cancel a recruitment AI model due to bias against women. More recently, concerns have been raised that hiring and screening algorithms discriminate against people with disabilities.

Key pillars for using AI responsibly

It’s vital that AI is used for good in the skill-based organization, or all the effort put into shifting from jobs to skills will be for nothing. Forming the basis of this approach are four key pillars for using AI responsibly:

Four key pillars for using AI responsibly. Image: Andreas De Neve

Pillar 1: Know your data sources

An AI tool is only as good as the data sources it works with. Give it inaccurate data and you’ll get inaccurate results. AI, working at scale in your organization, can drastically increase bias if it is modelled on biased or incomplete data. Before you start giving your AI skills data to train on, you must audit your data to ensure it's as accurate and representative as possible.

Ultimately, not all of your data will be objective or equitable. For example, in our work at TechWolf, we’ve discovered that male employees tend to over-report their skills, whereas female employees tend to under-report them. If you fail to account for such contextual factors when using skills data, your system will exclude important talent, undermining the entire point of the skill-based organization.
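
As a concrete illustration of such an audit, the sketch below checks whether one group systematically reports fewer skills than another before the data is used to train anything. It is a minimal example only; the pandas DataFrame and its columns (employee_id, gender, num_reported_skills) are hypothetical, not a prescribed schema.

```python
# Minimal audit sketch: check whether self-reported skill counts differ
# systematically between groups before the data is used to train an AI model.
# Column names (employee_id, gender, num_reported_skills) are hypothetical.
import pandas as pd


def audit_self_reported_skills(df: pd.DataFrame) -> pd.DataFrame:
    """Summarise reported-skill counts per group and flag large gaps."""
    summary = (
        df.groupby("gender")["num_reported_skills"]
          .agg(["count", "mean", "median"])
          .rename(columns={"count": "employees"})
    )
    overall_mean = df["num_reported_skills"].mean()
    # Flag groups whose average deviates by more than 20% from the overall mean;
    # the threshold is purely illustrative.
    summary["deviation_vs_overall"] = summary["mean"] / overall_mean - 1
    summary["flag_for_review"] = summary["deviation_vs_overall"].abs() > 0.20
    return summary


if __name__ == "__main__":
    sample = pd.DataFrame({
        "employee_id": range(6),
        "gender": ["F", "F", "F", "M", "M", "M"],
        "num_reported_skills": [8, 10, 9, 14, 16, 15],
    })
    print(audit_self_reported_skills(sample))
```

A flagged gap does not prove bias on its own, but it tells you where to look before a model trained on the data amplifies it.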

When evaluating data sources, organizations should keep four properties in mind:

Properties for evaluating data sources for using AI responsibly. Image: Andreas De Neve

A final note on this: we mandate the use of non-invasive data sources to avoid infringing on employee privacy. Avoid invasive sources such as email, private chat or any other data that could be seen as prying. A good check is to ask yourself whether you would be comfortable with your manager having access to the data sources you intend to use: your emails, your Slack messages and so forth.


Pillar 2: AI’s decisions are explainable

Robert Jones went down in history when he drove off a cliff edge while blindly following his sat-nav’s instructions. When acting on an AI’s recommendations, you must understand how it has come to its conclusions. Algorithms that can explain their reasoning are more trusted and less prone to bias.

That’s why there’s a host of legislation aimed at enforcing a certain level of explainability. In the EU, GDPR provides employees with the “right to an explanation” when algorithms measure or evaluate aspects related to them based on automated data processing. The AI Act is also making its way through the European Parliament. In the US, the National Institute of Standards and Technology has published its Four Principles of Explainable Artificial Intelligence.
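
One lightweight way to build that explainability in is to make every AI-inferred skill carry the evidence it was derived from, so an employee or manager can always see why the system reached its conclusion. The sketch below is an illustration only; the class and field names are invented for this example and do not describe any specific product or library.

```python
# Illustrative sketch: every inferred skill carries the evidence behind it,
# so the system can answer "why does the AI think I have this skill?".
# All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Evidence:
    source: str   # e.g. "project description", "completed course"
    snippet: str  # the text fragment that triggered the inference


@dataclass
class InferredSkill:
    name: str
    confidence: float                      # model confidence between 0 and 1
    evidence: list[Evidence] = field(default_factory=list)

    def explain(self) -> str:
        """Return a human-readable explanation of the inference."""
        lines = [f"Skill '{self.name}' inferred with confidence {self.confidence:.0%}:"]
        for ev in self.evidence:
            lines.append(f'  - from {ev.source}: "{ev.snippet}"')
        return "\n".join(lines)


skill = InferredSkill(
    name="Data visualization",
    confidence=0.82,
    evidence=[Evidence("project description", "built quarterly dashboards in Tableau")],
)
print(skill.explain())
```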


Pillar 3: Employees own their data

Employees generate data throughout the work day, with every document written, project finished and course completed. Ideally, the data generated and the skills inferred from it will be portable across organizations and usable by individuals to grow their careers. But at the very least, individuals must have the final say on whether a skill is part of their profile, and on whether AI can use that data.
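
As a sketch of what that final say could look like in practice, each entry in a skill profile can carry an employee-controlled status, and downstream AI only ever sees approved entries. The status values and function names below are assumptions made for illustration.

```python
# Hypothetical sketch of employee control over a skill profile:
# a skill is only visible to downstream AI once its owner has approved it.
from dataclasses import dataclass
from enum import Enum


class ConsentStatus(Enum):
    PENDING = "pending"      # inferred, awaiting the employee's decision
    APPROVED = "approved"    # employee confirmed; AI may use it
    REJECTED = "rejected"    # employee removed it; AI must not use it


@dataclass
class SkillEntry:
    name: str
    status: ConsentStatus = ConsentStatus.PENDING


def skills_usable_by_ai(profile: list[SkillEntry]) -> list[SkillEntry]:
    """Only employee-approved skills are ever passed on to AI systems."""
    return [s for s in profile if s.status is ConsentStatus.APPROVED]


profile = [
    SkillEntry("Python", ConsentStatus.APPROVED),
    SkillEntry("Public speaking"),                         # still pending
    SkillEntry("Forklift operation", ConsentStatus.REJECTED),
]
print([s.name for s in skills_usable_by_ai(profile)])      # ['Python']
```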

Key to this is highlighting (and delivering on) the benefits of using skills for workforce decisions. Taking this approach suits employees, with 79% stating that they’d be okay with having their skills data collected by employers and a further 14% open to it, depending on the purpose.

Pillar 4: Share the uses and benefits

Employees are open to sharing their skills data with their employers, but many only wish to do so if the benefits to them are clear. If they get fairer hiring, tailored work experiences and growth opportunities, they are happy to share data. Additionally, let employees know how their data is collected, from where and how the AI uses it. You might need to do a bit of upskilling around AI to ensure all employees understand these points.

Workers are willing to share their data for AI if they get benefits in exchange. Image: Deloitte Skills-Based Organization Survey, May-June 2022

The four pillars: holding up the skill-based organization

The world is moving towards a skill-based approach, and that increases AI’s role in workforce decision-making. It’s not a matter of if, but when, AI comes to the skill-based organization. Set the right foundations by integrating the four pillars into your AI strategy. Without them, everything will come crumbling down.
