How Edge Devices Can Help Mitigate the Global Environmental Cost of Generative AI
NORTHAMPTON, MA / ACCESSWIRE / May 8, 2024 / Qualcomm
Exploring the role of edge devices in reducing energy consumption and promoting sustainability in AI systems
Written by Angela Baker
The economic value of generative artificial intelligence (AI) to the world is immense. Research from McKinsey estimates that generative AI could add the equivalent of $2.6 trillion to $4.4 trillion annually.1
But the energy cost of AI and its environmental impact can also be extensive unless our technology approach evolves to effectively tackle these challenges.
Current projections vary, but some analyses of generative AI’s energy use and environmental impact are startling. A peer-reviewed paper in Joule projects that AI energy use could grow to more than 85 terawatt-hours per year, more electricity than many small countries consume (the paper uses Ireland as a comparison).2 Industry studies, such as Gartner’s, paint similarly dire pictures of the environmental impact and the expense of adapting our computing infrastructure to generative AI; Gartner predicts that by 2030, AI could consume up to 3.5% of the world’s electricity.3
Additionally, data center processing requires cooling, and cooling consumes water. In its latest environmental report, Microsoft disclosed that its global water consumption spiked 34% from 2021 to 2022, an increase that outside researchers tie to AI.
It is imperative to find ways to make AI processing more energy efficient and sustainable. But most reports focus almost entirely on the energy used by AI in the cloud and in data centers.
Increasing Efficiency by Running AI Models in Devices
Generative AI does not have to run exclusively in the cloud.
Currently, training a consumer-grade generative AI model requires a massive cluster of AI hardware and the power to run it. A researcher at the University of Washington estimated that training a model like GPT-3, which underpins ChatGPT, could use up to 10 gigawatt-hours, roughly what 1,000 U.S. households consume in a year (about 10,000 kilowatt-hours each).4
But once an AI model is trained, it can be reduced and optimized to run on a significantly less power-hungry piece of hardware, like a smartphone or battery-powered laptop.
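For illustration only, here is a minimal sketch of one common way a trained model can be shrunk for on-device use: post-training dynamic quantization in PyTorch. The tiny network, layer sizes and settings are assumptions made up for this example; they are not the optimization pipeline that Qualcomm or any particular vendor uses.

```python
# Minimal sketch: shrinking a trained model for on-device inference with
# post-training dynamic quantization in PyTorch. The small demo network below
# is hypothetical; a real deployment would start from a trained generative model.
import torch
import torch.nn as nn

# Stand-in for a trained model (weights would normally be loaded from disk).
model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
)
model.eval()

# Convert the Linear layers' weights from 32-bit floats to 8-bit integers.
# Smaller weights and cheaper integer math mean less memory traffic and
# less energy per query on a phone or laptop.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The quantized model runs the same forward pass, now using int8 kernels.
with torch.no_grad():
    out = quantized(torch.randn(1, 512))
print(out.shape)  # torch.Size([1, 128])
```

Quantization is just one lever; on-device deployments typically also use techniques such as pruning, distillation and compilation targeted at a specific chip.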
For instance, research by the analyst firm Creative Strategies5 concludes that Snapdragon 8 Gen 3, a flagship smartphone processor, is 30 times more efficient than a data center at image generation tasks. For laptop PCs, the same report finds that the Snapdragon X Elite compute platform is nearly 28 times more efficient than running the same AI task in a data center.
Running AI on local, private devices also saves the expense of sending queries and data across the network (and through the internal data-routing systems at a cloud provider) and sending the answers back.
Finally, the limited processing power of local devices, compared with the massive resources available in cloud data centers, enforces a form of discipline on AI software companies, app developers and users. Not every generative AI query needs the resources of a cloud-based model like GPT-4. By shifting a portion of AI tasks to the edge, we can take advantage of on-device processing, which handles many workloads with minimal power draw. A balanced strategy that deliberately distributes AI workloads across the cloud and the edge can improve performance and reduce energy consumption.
As technology providers begin to distribute generative AI capabilities to personal devices and start to gather data on the economics of various query types and where those queries run, we expect that they will start to surface these calculations for users, allowing people to make individual cost-based decisions about how much AI processing power they consume.
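As a rough illustration of what such a cost-aware split could look like, the sketch below routes each request to the device or the cloud using a simple estimated-energy heuristic. The per-token energy figures, token limit and function names are hypothetical assumptions invented for the example; they are not measurements and do not describe any vendor’s actual scheduler.

```python
# Hypothetical sketch of a cloud/edge dispatch heuristic. The energy figures,
# threshold, and names are illustrative assumptions, not measured data.
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    max_tokens: int

# Rough per-token energy guesses (joules) used only to make the trade-off concrete.
EDGE_J_PER_TOKEN = 0.5    # assumed on-device cost per generated token
CLOUD_J_PER_TOKEN = 5.0   # assumed data-center cost, including network overhead
EDGE_TOKEN_LIMIT = 1024   # assumed capacity of the smaller on-device model

def route(request: Request) -> str:
    """Send the request to the edge when the on-device model can handle it."""
    if request.max_tokens <= EDGE_TOKEN_LIMIT:
        estimate = request.max_tokens * EDGE_J_PER_TOKEN
        return f"edge (est. {estimate:.0f} J)"
    estimate = request.max_tokens * CLOUD_J_PER_TOKEN
    return f"cloud (est. {estimate:.0f} J)"

if __name__ == "__main__":
    print(route(Request("summarize this email", max_tokens=200)))   # routes to edge
    print(route(Request("draft a long report", max_tokens=4000)))   # routes to cloud
```

Surfacing the estimated energy alongside each answer, as in the strings above, is one way providers could let users make the kind of cost-based decisions described here.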
Taking Efficient Edge AI Technologies to the Cloud
The technology that enables edge devices, such as smartphones and tablets, has evolved to be both powerful and power-efficient. Users expect these devices to be fast, responsive, and capable of lasting a full day on a single battery charge.
In fact, modern smartphones have surpassed the power of IBM’s Deep Blue supercomputer, which gained fame for defeating chess grandmaster Garry Kasparov in 1997.6 What’s even more impressive is that these powerful mobile devices consume significantly less energy than an LED light bulb.7
This remarkable energy efficiency is the result of decades of innovation in the field. Lean computing instruction sets have been developed to process data using fewer operations, while systems-on-chip integrate multiple components into a single chip to reduce power consumption.
Such innovations have allowed Qualcomm Technologies, Inc. to deliver cloud AI processing products with record-breaking power efficiency, showing how edge technologies can help address the energy challenge of running AI models in the cloud.
AI Might Mitigate Its Own Efficiency Problems
AI tools are well-suited to optimizing complex systems and can be used to reduce energy requirements and environmental impacts.8 There’s a possibility that AI tools can help offset some of the impacts of human-caused climate change. Research from the Boston Consulting Group says that, “AI can accelerate climate action by taking climate modeling to the next level, enabling new approaches to climate education, and supporting breakthroughs in climate science, climate economics, and fundamental research.”9
At the moment, the world is still gathering data on the environmental costs and benefits of our new AI tools. The recent COP28 UN Climate Change Conference highlighted some of these data gaps. We are heartened that progress is being made and that AI can help; as Microsoft’s Brad Smith put it, “You can’t fix what you can’t measure, and these new AI and data tools will allow nations to measure emissions far better than they can today.”10
In the meantime, it’s imperative that we get a better handle on AI’s energy use itself, and do what we can to reduce that use – when possible – by running AI models on lower-power devices, and by using models that simply require less power.
At Qualcomm, we believe that AI systems should be designed, developed and deployed in a way that is mindful of environmental impact throughout their lifecycle and value chain. We are working to allow edge devices to run AI processing more efficiently, with leading performance per watt.
Learn more about Qualcomm’s responsible AI principles
Read about Qualcomm’s sustainability efforts
Opinions expressed in the content posted here are the personal opinions of the original authors, and do not necessarily reflect those of Qualcomm Incorporated or its subsidiaries (“Qualcomm”). The content is provided for informational purposes only and is not meant to be an endorsement or representation by Qualcomm or any other party. This site may also provide links or references to non-Qualcomm sites and resources. Qualcomm makes no representations, warranties, or other commitments whatsoever about any non-Qualcomm sites or third-party resources that may be referenced, accessible from, or linked to this site.
Snapdragon is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.
References:
1: McKinsey & Company. (Jun 14, 2023). The economic potential of generative AI: The next productivity frontier. Retrieved on Jan 12, 2024 from: https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier#introduction
2: Alex de Vries. (Oct 10, 2023). The growing energy footprint of artificial intelligence. Retrieved on Jan 12, 2024 from: https://www.cell.com/joule/abstract/S2542-4351(23)00365-3
3: Gartner. (Nov 7, 2023). Gartner says CIOs must balance the environmental promises and risks of AI. Retrieved on Jan 12, 2024 from: https://www.gartner.com/en/newsroom/press-releases/2023-11-07-gartner-says-cios-must-balance-the-environmental-promises-and-risks-of-ai
4: University of Washington. (Jul 27, 2023). Q&A: UW researcher discusses just how much energy ChatGPT uses. Retrieved on Jan 12, 2024 from: https://www.washington.edu/news/2023/07/27/how-much-energy-does-chatgpt-use/
5: Max Weinbach, Creative Strategies. (Apr 3, 2024). The Power of Efficiency: Edge AI’s Role in Sustainable Generative AI Adoption. Retrieved on Apr 11, 2024 from: https://creativestrategies.com/research/gen-ai-edge-testing/
6: ZME Science. (May 11, 2023). Your smartphone is millions of times more powerful than the Apollo 11 guidance computers. Retrieved on Apr 8, 2024 from: https://www.zmescience.com/feature-post/technology-articles/computer-science/smartphone-power-compared-to-apollo-432/
7: TechFow. (Nov 17, 2022). How Many Watts Does a Phone Use Per Hour. Retrieved on Apr 8, 2024 from: https://www.techfow.com/how-many-watts-does-a-phone-use-per-hour-beginners-guide/
8: Earth.org. (Sep 27, 2023). 7 Ways AI can support energy conservation efforts. Retrieved on Jan 12, 2024 from: https://earth.org/7-ways-ai-can-support-energy-conservation-efforts/
9: Boston Consulting Group. (Nov 20, 2023). How AI can speed climate action. Retrieved on Jan 12, 2024 from: https://www.bcg.com/publications/2023/how-ai-can-speedup-climate-action
10: Microsoft. (Nov 29, 2023). UNFCCC partners with Microsoft to use AI and advanced data technology to track global carbon emissions and assess progress under the Paris Agreement. Retrieved on Jan 12, 2024 from: https://news.microsoft.com/2023/11/29/unfccc-partners-with-microsoft-to-use-ai-and-advanced-data-technology-to-track-global-carbon-emissions-and-assess-progress-under-the-paris-agreement/
View additional multimedia and more ESG storytelling from Qualcomm on 3blmedia.com.
Contact Info:
Spokesperson: Qualcomm
Website: https://www.3blmedia.com/profiles/qualcomm
Email: info@3blmedia.com
SOURCE: Qualcomm
View the original press release on accesswire.com