The AI boom has sparked a race to innovate, with companies pushing the boundaries of the technology to build more complex models that require significant computing power. But this relentless pursuit of progress comes at a huge cost: a surge in energy consumption fueled by data centers struggling to keep pace.
Big Tech players such as Google and Microsoft have revealed that their own emissions have climbed significantly in recent years, and projections show that global energy consumption by data centers could exceed 1,000 TWh by 2026. Given that data centers already account for an estimated 2-4% of carbon emissions worldwide, the environmental impact of scaling AI is significant. The need for responsible AI is clear.
So how can we, as an industry, meet the growing demand for AI tools while remaining environmentally conscious?
Establishing ‘greener’ training models
Training large-scale AI systems is extremely power-intensive. Generative AI systems, for example, can consume 33 times more energy than machines running task-specific software. Training GPT-3 alone consumed an estimated 1.287 GWh of electricity, resulting in significant carbon emissions.
But it’s not all doom and gloom. The reality is that hardware and software optimization is likely to make a bigger difference to AI’s energy consumption than anything else. By refining how they train models, and by developing and using energy-efficient code, businesses can cut power use and shrink their carbon footprint.
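To make this concrete, here is a minimal sketch, not from the original article, of one widely used software optimization: mixed-precision training with PyTorch's AMP utilities. Running most of the forward and backward pass in 16-bit floating point reduces memory traffic and compute per training step, which translates into lower energy use per step. The model, data, and hyperparameters below are placeholders chosen purely for illustration.

```python
# Hedged sketch: mixed-precision training with PyTorch AMP.
# The tiny model and random data are stand-ins for a real workload.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()  # rescales gradients to avoid fp16 underflow
loss_fn = nn.CrossEntropyLoss()

for step in range(100):  # stand-in for iterating over a real data loader
    x = torch.randn(64, 512, device="cuda")
    y = torch.randint(0, 10, (64,), device="cuda")

    optimizer.zero_grad(set_to_none=True)
    with torch.cuda.amp.autocast():   # run the forward pass in fp16 where safe
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()     # backward pass on the scaled loss
    scaler.step(optimizer)
    scaler.update()
```

The same few lines apply to far larger models, where the savings in GPU time, and therefore energy, are correspondingly larger.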
For example, our latest AI model tuning benchmarks demonstrated improvements in both throughput and latency across a number of large language models, including LLaMA, Mistral, Mixtral, and Falcon. With efficiency gains of up to 40%, this kind of optimization matters: it not only speeds up training and reduces its cost but, crucially, can also cut energy consumption.
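For context on what throughput and latency mean in such benchmarks, the sketch below measures both for a Hugging Face causal language model. This is an illustrative assumption, not the benchmark methodology from the article: the checkpoint name is a placeholder, and the numbers it prints are whatever your hardware produces, not the figures cited above.

```python
# Hedged sketch: measuring generation latency and throughput for an LLM.
# The checkpoint name is a placeholder; any causal LM on the Hub works.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mistralai/Mistral-7B-v0.1"  # hypothetical choice of model
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name, torch_dtype=torch.float16, device_map="auto"
)

inputs = tok("Sustainable AI means", return_tensors="pt").to(model.device)
start = time.perf_counter()
out = model.generate(**inputs, max_new_tokens=128)
elapsed = time.perf_counter() - start  # wall-clock latency for one request

new_tokens = out.shape[1] - inputs["input_ids"].shape[1]
print(f"latency: {elapsed:.2f} s | throughput: {new_tokens / elapsed:.1f} tokens/s")
```

Tracking tokens per second before and after an optimization is the simplest way to verify that a tuning change actually reduces the compute, and hence the energy, spent per unit of useful output.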
Open-source collaboration plays a key role here. Sharing knowledge and code enables developers to create more energy-efficient tools, and to do so more quickly. An open-source approach will help the industry reduce AI’s environmental impact in the short term, while bringing more innovative thinking to long-term solutions.
Out with the old, in with the renewable
The surge in the use of AI applications has created an unprecedented demand for infrastructure, which is putting huge pressure on existing data centers – and power grids – around the world. Yet the development of new data centers can often be restricted due to their environmental impact.
This is why building and operating data centers in regions with a stable supply, if not an oversupply, of renewable power is crucial. Unlike legacy data centers, which were built close to demand hotspots, we’re starting to see the emergence of facilities that run on 100% renewable energy, and that shift will be transformative from a sustainability perspective.
By targeting locations with an oversupply of renewable energy, industry players can ensure that the development of AI and high-performance computing (HPC) data centers does not strain existing power grids. Deploying AI infrastructure clusters in these areas also makes operations more sustainable and efficient, as it puts to work energy that would otherwise be wasted.
Avoiding the AI sinkhole
The desire to innovate is central to any successful business. And yet, it’s important to balance that drive with ensuring AI is used with real purpose behind it.
Research from the Office for National Statistics (ONS) shows that last year, one in six UK businesses had adopted at least one AI tool. One issue, which has become particularly apparent since the launch of ChatGPT, is that the technology is often used unnecessarily. Viewed through a sustainability lens, businesses shouldn’t be using AI just because they can.
What we, as an industry, should be asking is: do we have a well-thought-out AI strategy? Are we using AI in a way that drives real business returns? Are there other, more energy-efficient means we could use instead? And when the technology is deployed, are we working with the most eco-friendly AI compute providers? Each of these questions requires a level of personal, and business-wide, accountability. But responsible AI use is crucial if we are to avoid an “AI sinkhole” and minimize wasted resources.
A sustainable AI future
Concerns around the environmental impact of AI are important for the industry to address, and to address quickly. Although the estimates around energy consumption are stark, there are tangible, achievable steps we can take to create a more sustainable AI future. We can train AI models more efficiently, we can use data centers powered by renewable energy and, perhaps most crucially, we can implement best practices around responsible AI use in the workplace.
By doing so, we can embrace the exciting progress of AI and unlock its full potential, without our environment paying the price.