Posted in Emergent Tech

How organisations can walk the tightrope of GenAI’s energy appetite vs sustainability goals 

While generative AI (GenAI) adoption is accelerating fast, there are serious concerns about its energy consumption.

A small country can consume more than 85.4 terawatt-hours of electricity annually. If current trends in AI capacity and adoption continue, a comparable amount could be consumed by generative AI (GenAI), a type of AI that can create new content, according to research published in the journal Joule and reported by Scientific American.

All this, added to data centres accounting for over 1.5 per cent of global electricity use, according to the International Energy Agency, could put a crimp on organisations seeking a balance between AI and sustainability, especially in the MENA region.

But, as with most things in AI and technology, this is a double-edged sword.

GenAI’s energy consumption is alarming, yet AI and tech solutions can also help companies cut waste and conserve resources through accurate mathematical modelling.

“There aren’t any details on how much energy will continue to be consumed. However, organisations must develop an energy strategy. Currently, building and using large language models (LLMs) is expensive. Still, by developing an energy strategy, organisations can control costs and extract more value from data as it grows,” said Fred Lherault, CTO Emerging, Pure Storage.  

Lherault explained that advanced energy management systems, such as smart power distribution units and storage arrays, can help by providing precise measurement of energy consumption. “With this, companies can reduce server power usage and store data in powerful storage systems. With the growing use of AI, the consumption, and information overload, there is a need for efficient and sustainable systems.”

He added that consolidating data onto such systems also eliminates the need for several internal storage devices, reducing power consumption by up to 85 per cent per terabyte. Pure Storage, for example, is already doing this.

“While AI is a powerful tool, we acknowledge concerns about its energy consumption. However, our focus on automation through AI solutions like demand forecasting is a testament to its potential to reduce this footprint. By optimising operations and potentially streamlining the need for multiple AI models, we can contribute to a more sustainable F&B ecosystem. Furthermore, our cloud-based platform leverages data centres increasingly adopting renewable energy sources. This ensures a greener infrastructure to power our solutions,” commented Alex Ponomarev, CEO of Syrve MENA.

As researchers explain, there are two big phases when it comes to AI. One is the training phase, where the model is set up and taught how to behave. This is followed by the inference phase, where the model is put into live operation and fed prompts so that it can produce original responses. The inference phase is particularly energy-intensive as it involves real-time processing and decision-making.
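The two phases above can be illustrated with a deliberately tiny sketch. A real LLM trains billions of parameters across GPU clusters; this single-parameter toy only shows where the energy goes in each phase: many repeated passes over data during training, then one forward pass per prompt during inference.

```python
# Toy illustration of the two AI phases described above (not a real LLM).

def train(data, epochs=100, lr=0.1):
    """Training phase: repeatedly adjust the model's parameter.

    The nested loops are the point: training makes many passes over
    the data, which is why this phase burns so much compute up front.
    """
    w = 0.0
    for _ in range(epochs):               # many passes over the data
        for x, y in data:
            pred = w * x
            w -= lr * (pred - y) * x      # gradient step
    return w

def infer(w, x):
    """Inference phase: one forward pass per incoming prompt."""
    return w * x

data = [(1.0, 2.0), (2.0, 4.0)]           # toy dataset for y = 2x
w = train(data)                           # expensive, done once
print(round(infer(w, 3.0), 2))            # cheap per query, but repeated
                                          # for every user prompt at scale
```

Training is a one-off cost, but inference energy scales with every query served, which is why the researchers single it out once a model goes into live operation.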

“There is a complex interplay of trade-offs as we don’t have a full picture. Alternative energy sources are becoming drastically cheaper, and thanks to AI, the go-to-market for storage and distribution tech is contracting. However, AI energy demand may double by 2026,” added Amina Musaeva, founder of Cloudset.  

She added that while AI consumes more energy, it also makes achieving results faster and more efficient, so its energy demand could be more than offset by energy savings on the consumer’s end.

However, information on AI training is becoming more secretive, and these models are becoming more complex. The other problem is hype, which induces us to throw more compute and data at models even as the hardware becomes more efficient.

Some may claim that superior autonomous intelligence would ultimately offset this energy waste, a classic techno-centric argument in the dilemma of progress versus climate change.

“A crucial starting point is to establish transparent reporting on the AI energy demands, create universal metrics for monitoring the offsetting and efficiency gains, and potentially introduce some energy capping and kWh trading strategies. This will ensure that only those with the highest need and greatest efficiency potential obtain the most quotas,” added Musaeva, emphasising the importance of transparency in AI energy consumption.   

New AI models are being developed with energy efficiency in mind. CLOVER, for example, is an AI system that adjusts its size based on the task: it determines what a user is trying to do and selects only as large a model as that task truly needs.

“The team reported that CLOVER can cut the greenhouse gas emissions of AI used at a data centre by more than 75 per cent. And with those savings, the accuracy of AI models’ results drops by only 2 to 4 per cent. Tech companies can route their computations to data centres mostly powered by renewable sources. However, the supply is still too limited,” explained Musaeva.
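The idea behind task-aware model sizing can be sketched in a few lines. This is an illustrative toy, not CLOVER’s actual algorithm: the variant names, energy figures, and accuracy numbers are invented to show the trade-off of accepting a small accuracy drop in exchange for a large energy saving.

```python
# Hypothetical sketch of task-aware model selection: pick the cheapest
# model variant whose expected accuracy still meets the task's bar.
# All names and numbers below are illustrative assumptions.

MODEL_VARIANTS = [
    # (name, relative energy per query, expected accuracy)
    ("small",  1.0, 0.90),
    ("medium", 4.0, 0.93),
    ("large", 10.0, 0.94),
]

def select_model(required_accuracy):
    """Return (name, energy) of the cheapest adequate variant."""
    for name, energy, acc in MODEL_VARIANTS:   # sorted cheapest first
        if acc >= required_accuracy:
            return name, energy
    return MODEL_VARIANTS[-1][:2]              # fall back to the biggest

# A task tolerating ~92% accuracy skips the 10x-energy "large" model:
name, energy = select_model(0.92)
print(name, energy)
```

In this toy, a task that tolerates 92 per cent accuracy is served by the “medium” variant at 4 units of energy rather than the “large” one at 10, mirroring the reported pattern of big emissions cuts for a few points of accuracy.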

While the amount of energy AI models will consume is yet to be determined, many believe regulators should start requiring energy use disclosures from AI developers, as there is limited data.