How organizations can adopt AI without expanding their carbon footprint
- The explosion of AI across domains will only further accelerate the demand for computing power and energy.
- The answer is to develop sustainable computing practices, built on five components.
- Each of these steps tackles the sustainable computing challenge in a different way, but the impact of bringing them all together will be significant.
Computing technologies continue to advance at a breathtaking pace, powering the future of innovation by driving scientific research, engineering breakthroughs, and new digital services.
Computing’s massive role in driving innovation and commerce is accompanied by an equally significant carbon footprint. The Uptime Institute, for example, reports that server power consumption has increased by 266% since 2017. And data centres now account for 3% of global electricity consumption, likely reaching 4% by 2030.
The explosion of artificial intelligence (AI) across domains will only further accelerate the demand for computing power and energy. How can we benefit from the economic growth that the AI revolution will enable, without also growing the carbon footprint of computing?
The answer is to develop sustainable computing practices built on five components: improving utilization via cloud computing, adopting domain-specific architectures, enabling workload portability, automating performance optimization, and using sustainable energy sources. Each of these steps tackles the sustainable computing challenge in a very different way, but the impact of bringing them all together will be significant.
Improve server utilization with cloud computing
Whether AI workloads run in a corporate data centre or a cloud provider's data centre, higher utilization means fewer servers are running, and therefore less power is being used. Utilization estimates vary, but corporate data centres commonly run at low utilization, often between 10% and 50%. Cloud providers, on the other hand, are known to maintain utilization rates of 50-80%. They can do this because they aggregate demand at a much broader scale and can capture hyper-scale efficiencies.
But moving workloads to the cloud can be a complex task. It takes time, money and expertise to ensure it is done right. With automation, we can improve access to the cloud for more organizations, and improve overall utilization worldwide.
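To make the utilization point concrete, here is a rough back-of-envelope sketch in Python. The demand, capacity and power figures are illustrative assumptions rather than measured data; the point is simply that serving the same demand at higher utilization requires far fewer running servers.

```python
# Back-of-envelope estimate: servers and power needed to serve the same
# aggregate demand at different average utilization rates.
# All figures below are illustrative assumptions, not measured data.
import math

def servers_needed(demand: float, capacity_per_server: float, utilization: float) -> int:
    """Servers required when each server runs at the given average utilization."""
    return math.ceil(demand / (capacity_per_server * utilization))

def fleet_power_kw(servers: int, avg_power_per_server_kw: float) -> float:
    """Approximate fleet power draw, ignoring cooling and networking overhead."""
    return servers * avg_power_per_server_kw

DEMAND = 10_000    # units of work per hour to serve (assumed)
CAPACITY = 100     # units of work per hour per fully loaded server (assumed)
POWER_KW = 0.5     # average draw per running server, in kW (assumed)

for label, util in [("on-premises (~30% utilization)", 0.30),
                    ("cloud (~70% utilization)", 0.70)]:
    n = servers_needed(DEMAND, CAPACITY, util)
    print(f"{label}: {n} servers, ~{fleet_power_kw(n, POWER_KW):.0f} kW")
```

Under these assumed numbers, lifting average utilization from roughly 30% to 70% cuts the required server fleet, and its power draw, by more than half.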
Acceleration via domain-specific architectures
The days of the general-purpose central processing unit (CPU) as the default computer chip are over. These days, computing advancements are happening with domain-specific architectures like graphics processing units (GPUs), tensor processing units (TPUs) and other variants.
As the term suggests, domain-specific architectures are optimized to run specific workloads, and when they do, the performance gains are impressive. As a result, the variety of semiconductors has exploded. Over the past 10 years, performance gains from specialized computing chips have increased over 1,000x relative to general-purpose CPU architectures. And in 2022 alone, cloud service providers introduced almost 400 new hardware options.
And there is promise for still more efficient chips for computing, networking and memory. Semiconductor makers need to keep pushing for greater energy efficiency, possibly by tapping into technology used in low-power mobile devices.
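As a simple illustration of why matching workloads to domain-specific hardware matters, the sketch below times the same matrix-multiplication workload on a general-purpose CPU and, if one is present, on a GPU. It uses PyTorch purely as an example framework (the article does not name any particular library), and the matrix size and repeat count are arbitrary.

```python
# Minimal sketch: time the same workload on a CPU and, if available, a GPU.
# PyTorch is used here only as an example framework; sizes are arbitrary.
import time
import torch

def benchmark_matmul(device: str, size: int = 4096, repeats: int = 10) -> float:
    """Time a batch of matrix multiplications on the given device."""
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure timing covers the GPU work
    start = time.perf_counter()
    for _ in range(repeats):
        _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()
    return time.perf_counter() - start

devices = ["cpu"] + (["cuda"] if torch.cuda.is_available() else [])
for dev in devices:
    print(f"{dev}: {benchmark_matmul(dev):.2f} s for 10 matmuls")
```

On machines with a modern GPU, the accelerated run typically finishes far sooner, which is the kind of gap that makes domain-specific architectures both faster and more energy-efficient per unit of work.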
Enable workload portability through automation
While a wide choice of greener chip options is available, it doesn't mean organizations can easily take advantage of them. The difficulty and cost of switching out servers or cloud services mean organizations can't easily move from one hardware architecture to another.
Given the economics of on-premises data centres, hardware investments are typically amortized over at least three years. So with on-premises hardware, organizations are almost immediately burdened with chips that are less efficient than the latest ones available via the cloud.
But even with cloud services, it is very difficult and complex to port applications from one cloud infrastructure to another. This is inhibiting adoption of best-fit architectures for greener computing.
To address this, organizations need new ways to automate all the steps required to move applications to different cloud services.
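One way to picture that automation is to describe a workload once, in a hardware- and vendor-neutral form, and let adapters translate it for whichever cloud target is greenest or best-fit at the time. The sketch below is purely illustrative: the WorkloadSpec class, its fields and the two "cloud" request formats are hypothetical, not any real provider's API.

```python
# Illustrative sketch only: a hardware-neutral workload description that
# adapters translate for different cloud targets. All class names, fields
# and request formats here are hypothetical, not any vendor's real API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkloadSpec:
    name: str
    container_image: str        # packaging the app as a container keeps it portable
    vcpus: int
    memory_gb: int
    accelerator: Optional[str]  # e.g. "gpu", "tpu", or None

def to_cloud_a(spec: WorkloadSpec) -> dict:
    """Translate the neutral spec into hypothetical Cloud A request fields."""
    return {
        "jobName": spec.name,
        "image": spec.container_image,
        "machineType": f"standard-{spec.vcpus}cpu-{spec.memory_gb}gb",
        "gpuAttached": spec.accelerator == "gpu",
    }

def to_cloud_b(spec: WorkloadSpec) -> dict:
    """Translate the same spec into hypothetical Cloud B request fields."""
    return {
        "task": spec.name,
        "container": spec.container_image,
        "cpu": spec.vcpus,
        "memoryGiB": spec.memory_gb,
        "acceleratorType": spec.accelerator or "none",
    }

job = WorkloadSpec("cfd-simulation", "registry.example/cfd:1.0",
                   vcpus=32, memory_gb=128, accelerator="gpu")
print(to_cloud_a(job))
print(to_cloud_b(job))
```

The design point is that the application is described once; only thin, automatable adapters change when the underlying infrastructure does.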
Automated assessment of chip efficiencies
Being able to easily switch to different cloud-based infrastructures isn’t helpful if organizations don’t know the right ones to use for specific applications (and even specific types of compute workloads).
This process of matching an application to its best-fit hardware requires automation. Things are simply changing too quickly for organizations to constantly test, benchmark and then adopt new hardware.
And getting this wrong can consume orders of magnitude more power than necessary. For example, new GPUs are much more efficient than traditional CPUs for running deep-learning neural networks because they can process multiple parallel tasks up to three times faster than a CPU.
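A minimal sketch of what such automated assessment could look like: rank candidate hardware options by estimated energy to solution (runtime multiplied by average power draw) for a given workload. The runtimes and power figures below are made-up assumptions for illustration, not vendor benchmarks.

```python
# Sketch: rank candidate hardware by estimated energy to solution
# (runtime x average power draw). All numbers are illustrative assumptions,
# not vendor benchmarks.

CANDIDATES = {
    # option: (estimated runtime in hours, average power draw in kW)
    "general-purpose CPU node": (9.0, 0.40),
    "previous-gen GPU node":    (2.5, 0.70),
    "latest-gen GPU node":      (1.2, 0.90),
}

def energy_kwh(runtime_h: float, power_kw: float) -> float:
    """Total energy consumed to complete the workload."""
    return runtime_h * power_kw

ranked = sorted(CANDIDATES.items(), key=lambda kv: energy_kwh(*kv[1]))

for name, (runtime_h, power_kw) in ranked:
    print(f"{name}: {runtime_h:.1f} h, {energy_kwh(runtime_h, power_kw):.2f} kWh")

print(f"Lowest energy to solution: {ranked[0][0]}")
```

In this made-up example, the fastest option uses the least total energy even though it draws the most power while running, which is exactly the kind of non-obvious trade-off that benchmark-driven automation can keep re-evaluating as new hardware appears.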
Cleaner power for data centres
Finally, beyond helping organizations take best advantage of growing hardware options to lower their carbon footprint, the industry needs to keep pushing to develop cleaner, more sustainable cloud computing data centres.
This includes pushing for innovative new approaches, such as negative-carbon solutions combining clean energy sources, as well as capturing data centre heat for reuse as a supplementary energy source.
Incentives, tax credits, and other tools can further encourage such innovations for cleaner, more energy-efficient data centres.
The cloud: just the start to greener computing
While the demand for computing power will march on, organizations are no longer tied to the old computing paradigms of on-premises data centres. Fortunately, the cloud now provides the options they need to make the best choices for powering AI and their other computing needs.
But there is more work to do to make it even easier for organizations to use greener alternatives for their computing needs. Incentives can certainly help, but it will be crucial to automate how organizations identify and adopt the greenest computing options.
With these capabilities in place, organizations can move forward to harness AI to address the world’s biggest innovation challenges while ensuring more sustainable computing.
At Rescale, we are committed to developing the ecosystem, technologies and capabilities to support organizations in achieving their sustainable computing goals.