How businesses should respond to the EU’s Artificial Intelligence Act
- The EU’s Artificial Intelligence Act aims to regulate AI technologies.
- Failure to plan could leave some businesses at risk of non-compliance.
- Proving AI produces measurable value in your sector will be key.
The EU strikes again with a new set of regulations, this time taking aim at the use of artificial intelligence (AI) and the variety of risks associated with its societal adoption. Like its sibling, the General Data Protection Regulation (GDPR), the Artificial Intelligence Act (AIA) has teeth, with fines rising to €30 million, or 6% of global revenue. Is the answer to delete all your AI systems and reduce your risk to zero, or to keep using AI for a competitive edge? Can you manage the recurring costs of maintaining compliance with the AIA even as the technology itself boosts your bottom line?
Take the famous UK pub chain JD Wetherspoon, founded in 1979 by British businessman Tim Martin, an outspoken critic of the EU and a Brexit campaigner. Its response to the protection of personally identifiable information (PII) mandated by the GDPR was, in 2017, to delete its entire CRM database. Perhaps a drastic, knee-jerk reaction, but with the EU imposing GDPR-related fines of up to €20 million, or 4% of global revenue, my guess is JD Wetherspoon didn’t fancy taking on the risk, or investing a similar sum to assure its GDPR compliance.
What is the aim of the Artificial Intelligence Act?
AI as a technology isn’t the problem; it is the human creators and business professionals behind it who require intervention. The EU aims to create a globally recognized “factory” for producing safe, trusted, and ethical AI outcomes that respect existing laws on fundamental human rights and EU values. To enable the EU’s ethical AI mission, the AIA primarily aims to do two things:
- Identify AI systems that present unacceptable risk (e.g., from social scoring by governments to toys using voice commands that encourage dangerous behaviour).
- Apply strict obligations to AI systems that present high risk (e.g., from CV-sorting software for recruitment procedures, to credit scoring denying citizens the opportunity to obtain a loan).
Businesses that choose to build high-risk AI systems will be legally required to meet a defined list of criteria before they can be integrated into the single market. This means designing AI systems with transparency, explainability, and ethics embedded in their core, and having monitoring, guardrails, and governance capabilities already in place to ensure continued ethical compliance.
Many organizations have already implemented data protection and cybersecurity frameworks that provide a similar groundwork for AIA compliance. This makes meeting the requirements of the AIA not as challenging as starting compliance from scratch.
Complying with regulation that requires continuous risk evaluation and mitigation is a standard part of doing business in our economic reality. It’s not easy, but it should not be a giant leap for humankind. Successfully turning a prototype of an algorithmic design into an integrated version that can be easily mass-produced and yields recurring value – while being able to clearly explain how that AI system positively impacts a dozen micro KPIs that all roll up into revenue gains, cost savings, and improved margins – now that’s hard.
To delete or not to delete?
The Center for Data Innovation’s analysis of the European Commission’s impact assessment draws several conclusions about the current AIA proposal, including:
- “The AIA will cause an additional 17% of overhead on all AI spending.” Therefore, is AI worth the additional 17% out of your pocket to maintain compliance with the AIA?
- “A business with €10 million turnover would see its profits fall by 40%.” Ouch!
If you run a small or medium-sized business, you might be thinking AI (and the associated AIA obligations) isn’t worth the trouble – I hear you. You could delete all of your AI, as JD Wetherspoon did with its CRM and email-based customer marketing capabilities back in 2017. Problem solved, right? If something isn’t valuable to your business and shows no realistic competitive upside in the future, it makes sense to remove it.
There is mounting evidence that AI systems provide enough of a competitive advantage that it would be a lost opportunity for any business not to begin, or continue, investing in AI. What does this mean for the AIA, and for whether deleting your AI inventory is a sensible fallback option? It provides the motivation to ask probing questions before your next move:
- Why do we need this?
- Why now?
- How does this AI system impact our KPIs: revenue, cost, and profit?
- How does this AI system impact society?
- What evidence is available that proves the AI system can deliver a more valuable and ethical outcome than a human, or that it can augment a human to operate at a higher commercial and ethical level?
- What would happen if that AI system were switched off for the rest of the year?
- What advantage would the competition have if they had this AI system already in place?
Rachik Laouar, Head of Data Science at The Adecco Group UK, asked the right questions and directed his team to use a form of AI called automated machine learning to swiftly build an intelligently automated lead prioritisation system. His business stakeholders conservatively expect the solution to deliver £1 million within its first year. Within the first two months of the AI system’s implementation, 27% of that target had already been achieved – how could you compete against Adecco without AI?
The Artificial Intelligence Act is an opportunity
I’ve witnessed, and helped create, too many success stories in the AI market to suggest that AI lacks value. I would never advise abandoning AI, or deleting it, as a fallback, but I agree that algorithmic solutions must improve their impact on society. Rather than a fallback, there is an opportunity to return to the drawing board.
How can you mitigate the risk of missing your industry’s peak AI value wave whilst balancing the need to prepare for the AIA?
- If you’ve yet to get started, find real evidence that AI is producing measurable value in your sector, and begin swiftly and rationally building the personalised case for AI in your business.
- If you’ve already begun an AI project, leave nothing to chance. You don’t want an executive decision-maker getting cold feet because of the AIA’s penalty pressures and compliance costs. Start by recalculating the business value of your existing AI systems and switch on your AI ethical compass. Make the hard decisions without being told, and if something fails the business value assessment or the AIA compliance assessment, rethink the approach.
The spirit of the AIA is to ensure that regulated AI systems deliver ethical outcomes despite the inherent bias in human behaviour. There will always be some leaders who fail to embrace the opportunity these guidelines provide: to build commercially ethical AI systems that deliver competitive advantage while protecting society against potential harm. Those who look to yield value from the AIA need only ask the right questions, whether for the first time or once again.