How Energy-Expensive Is Artificial Intelligence?
The use of artificial intelligence has skyrocketed, driving a surge in electricity demand from data centres, according to the BBC. Generative AI is singled out as a particularly heavy consumer: every query runs through a large general-purpose model, demanding far more computational power than task-specific software.
As data centres continue to expand to meet this demand, concerns over energy consumption grow. The International Energy Agency projects that data centres’ electricity usage could double between 2022 and 2026, exceeding 1,000 terawatt-hours, roughly Japan’s annual consumption. The issue is particularly pressing in countries like Ireland, where data centres already account for a substantial share of national electricity use, close to a fifth by recent estimates.
Considering the significant energy implications, it’s only natural to ask: How energy-expensive is AI really?
The Energy Demands Of Artificial Intelligence
AI has become an integral part of our devices and media, powering everything from email automation to quirky chatbots. The energy these systems consume, however, is drawing increasing scrutiny.
Estimates suggest that training a single generative AI model can consume an astonishing amount of electricity. Training GPT-3, for example, used nearly 1,300 megawatt-hours, roughly the annual power usage of 130 US homes, according to The Verge. To put this into perspective, streaming an hour of Netflix uses about 0.8 kilowatt-hours, so that one training run is equivalent to more than 1.6 million hours of streaming. Precise figures remain unclear, however, because companies like Meta and Microsoft disclose little about their models’ energy use.
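These equivalences are simple arithmetic, and it helps to see them worked out. The short sketch below reproduces the conversion, assuming the rounded figures from The Verge’s reporting (1,300 MWh for training, roughly 10 MWh per US home per year, and about 0.8 kWh per streamed hour); the inputs are illustrative, not official measurements.

```python
# Back-of-envelope conversion of GPT-3's reported training energy
# into household and streaming equivalents. Inputs are the rounded
# figures cited in the reporting, not official measurements.

TRAINING_ENERGY_KWH = 1_300 * 1_000  # ~1,300 MWh reported for training GPT-3
HOME_ANNUAL_KWH = 10_000             # ~10 MWh: rough annual usage of a US home
NETFLIX_HOUR_KWH = 0.8               # ~0.8 kWh per streamed hour (cited estimate)

homes = TRAINING_ENERGY_KWH / HOME_ANNUAL_KWH
streaming_hours = TRAINING_ENERGY_KWH / NETFLIX_HOUR_KWH

print(f"Equivalent to ~{homes:.0f} US homes' annual usage")       # ~130
print(f"Equivalent to ~{streaming_hours:,.0f} hours of Netflix")  # ~1,625,000
```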
While training AI models is undeniably energy-intensive, the cost of actually running them, known as “inference,” varies significantly from task to task.
The Verge reports that recent research by Sasha Luccioni and her colleagues sheds light on the energy costs of different AI tasks. Generating text, for instance, consumes relatively little energy, while generating images is far more demanding: the most energy-intensive image models used roughly as much energy per image as fully charging a smartphone.
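To make those comparisons concrete, the sketch below converts per-task energy figures into the smartphone-charge equivalents the study used as a yardstick. The task figures and the 0.012 kWh-per-charge value are approximate numbers associated with that line of research; treat them as illustrative rather than definitive.

```python
# Convert per-1,000-query energy figures into smartphone-charge
# equivalents. The task values below are approximate and
# illustrative, not authoritative benchmark results.

SMARTPHONE_CHARGE_KWH = 0.012  # approximate energy to fully charge a phone

# Approximate energy per 1,000 inferences, by task (kWh)
tasks_kwh_per_1000 = {
    "text classification": 0.002,
    "text generation": 0.047,
    "image generation (average)": 2.9,
    "image generation (least efficient model)": 11.5,
}

for task, kwh in tasks_kwh_per_1000.items():
    charges = kwh / SMARTPHONE_CHARGE_KWH
    print(f"{task}: ~{charges:.0f} phone charges per 1,000 queries")
```

Under these figures, the least efficient image model lands at roughly one full phone charge per image, which is where the smartphone comparison comes from.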
The energy efficiency of AI models varies widely, and the lack of transparency from the companies building them complicates any effort to gauge AI’s true environmental impact.
Why Does AI Use So Much Energy?
AI consumes a considerable amount of energy for several reasons. The training process, in which a model learns from vast amounts of data, is the most intensive stage: training a large model like GPT-3 typically occupies thousands of specialised processors for weeks, which translates directly into high energy consumption.
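A common way to see where training figures like GPT-3’s come from is a back-of-envelope estimate: energy is roughly the number of accelerators, times the power draw of each, times the training hours, times a data-centre overhead factor (PUE). The sketch below applies that formula with assumed, illustrative inputs; none of the specific values come from the article.

```python
# Back-of-envelope estimate of training energy:
#   energy = accelerators * power per accelerator * hours * PUE
# All inputs are assumed, illustrative values, not figures from
# the article or from any published training run.

def training_energy_mwh(n_accelerators: int,
                        watts_per_accelerator: float,
                        hours: float,
                        pue: float = 1.2) -> float:
    """Estimate total facility energy for a training run, in MWh.

    PUE (power usage effectiveness) scales IT power up to account
    for cooling and other data-centre overhead.
    """
    watts = n_accelerators * watts_per_accelerator * pue
    return watts * hours / 1e6  # watt-hours -> megawatt-hours

# Hypothetical run: 1,000 GPUs drawing 400 W each for 30 days
print(f"~{training_energy_mwh(1_000, 400, hours=30 * 24):.0f} MWh")  # ~346 MWh
```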
The complexity of the models themselves also contributes: answering a single query can involve billions of arithmetic operations across a model’s parameters. On top of that, deploying AI at scale, serving inference and image-generation requests to large numbers of users, requires significant computational resources of its own, adding to the overall energy footprint.
Inefficiencies in current AI hardware and software exacerbate the problem. As AI technologies grow more sophisticated, energy-efficient solutions become increasingly important for limiting the technology’s environmental impact.
As AI continues to permeate various aspects of our lives, the energy demands it places on data centres raise valid concerns. With generative AI being particularly energy-intensive, the need for more efficient computational processes is evident. As data centres expand to accommodate demand, projections point to a sharp rise in electricity consumption, raising questions about sustainability. While advances in energy-efficient AI models and hardware offer hope, understanding and addressing the overall energy impact of AI remains imperative for a sustainable future.