This article was written and contributed by Intel. 

We are witnessing a historic, global paradigm shift driven by dramatic improvements in artificial intelligence. With the evolution from predictive to generative AI, more businesses are taking notice: enterprise adoption of AI has more than doubled since 2017 [1]. According to McKinsey, 63 percent of respondents expect their organizations' investment in AI to increase over the next three years.

Paralleling the unprecedented adoption of AI, the volume of compute it consumes is increasing at a stunning rate: since 2012, the amount of compute used in the largest AI training runs has grown by more than 300,000x [2]. As those computing demands grow, significant environmental implications follow.

More compute means greater electricity consumption and, with it, more carbon emissions. A 2019 study by researchers at the University of Massachusetts Amherst estimated that training a transformer, a type of deep-learning model, can emit more than 626,000 pounds (~284 metric tons) of carbon dioxide, equal to the lifetime emissions of five cars or more than 41 round-trip flights between New York City and Sydney [3]. And that's just training the model.

This trajectory of ever more intensive AI with an ever-growing environmental footprint is simply not sustainable. We need to rethink the status quo and change our strategies and behavior.

Driving sustainable improvements with AI

While the growing prominence of AI undoubtedly carries serious carbon-emissions implications, it also presents enormous opportunities. Real-time data collection combined with AI can help businesses quickly identify areas for operational improvement and reduce carbon emissions at scale.

For example, AI models can identify immediate improvement opportunities for factors influencing building efficiency, including heating, ventilation, and air conditioning (HVAC). As a complex, data-rich, multi-variable system, HVAC is well suited to automated optimization, and improvements can lead to energy savings within just a few months. While this opportunity exists in almost any building, it's especially valuable in data centers. Several years ago, Google shared how applying AI to data center cooling cut the energy used for cooling by up to 40 percent [4].

AI is also proving effective for carbon-aware computing: automatically shifting flexible computing tasks to times or places where renewable energy is available can lower the carbon footprint of the activity.
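
As a minimal sketch of the idea, the Python snippet below defers a flexible training job until grid carbon intensity falls below a threshold. The get_carbon_intensity() function, the threshold, and the polling interval are hypothetical stand-ins; a real scheduler would query a grid operator's or cloud provider's carbon-intensity signal.

```python
import random
import time

CARBON_THRESHOLD = 200.0  # gCO2/kWh; hypothetical cutoff for "clean enough" power


def get_carbon_intensity() -> float:
    """Hypothetical stand-in for a real grid-data API: returns the grid's
    current carbon intensity in gCO2/kWh (here, just a simulated value)."""
    return random.uniform(100.0, 400.0)


def run_training_job() -> None:
    print("Grid is clean; starting the deferred training run.")


def run_when_grid_is_clean(poll_seconds: float = 1.0) -> None:
    """Defer a flexible workload until low-carbon power is available,
    polling the intensity signal between checks."""
    while get_carbon_intensity() > CARBON_THRESHOLD:
        time.sleep(poll_seconds)  # wait for a cleaner grid mix
    run_training_job()


if __name__ == "__main__":
    run_when_grid_is_clean()
```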

Likewise, AI can help diminish the ballooning data storage problem. In his book World Wide Waste [5], Gerry McGovern observes that up to 90 percent of stored data is never used; it is merely stored. AI can help determine what data is valuable, necessary, and of high enough quality to warrant storage. Superfluous data can simply be discarded, saving both cost and energy.

How to design AI projects more sustainably

To responsibly implement AI initiatives, we all need to rethink a few things and take a more deliberate approach.

Begin with a critical examination of the business problem you are trying to solve. Ask: do I really need AI to solve this problem, or can traditional probabilistic methods with lower compute and energy requirements suffice? Deep learning is not the solution to every problem, so it pays to be selective when making the determination.
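
One practical way to make that determination is to benchmark a lightweight classical model first. The scikit-learn sketch below, using an illustrative bundled dataset, fits a logistic regression baseline; if its accuracy already meets the business requirement, a deep network and its training energy may be unnecessary.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Small tabular problem used purely as an illustration.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A lightweight classical baseline: trains in milliseconds on a laptop CPU.
baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
baseline.fit(X_train, y_train)

acc = accuracy_score(y_test, baseline.predict(X_test))
print(f"Baseline accuracy: {acc:.3f}")
# If this already meets the requirement, skip the deep network entirely.
```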

Once you've clarified your business problem or use case, carefully consider the construction of your solution and model. 

  1. Emphasize data quality over data quantity. Smaller datasets require less energy for training and have lighter ongoing compute and storage footprints, thereby producing fewer carbon emissions. Studies show [6] that many of the parameters within a trained neural network can be pruned by as much as 99 percent, yielding much smaller, sparser networks (see the pruning sketch after this list).
  2. Consider the level of accuracy truly needed for your use case. For instance, if you fine-tune your models to use lower-precision INT8 calculations rather than compute-intensive FP32 calculations, you can drive significant energy savings (see the quantization sketch after this list).
  3. Leverage domain-specific models and stop reinventing the wheel. Orchestrating an ensemble of models built on existing, trained datasets can give you better outcomes. For example, if you already have a large model trained to understand language semantics, you can build a smaller, domain-specific model tailored to your need that taps into the larger model's knowledge base, producing similar outputs far more efficiently (see the distillation sketch after this list).
  4. Balance your hardware and software from edge to cloud. A more heterogeneous AI infrastructure, combining AI computing chipsets matched to specific application needs, saves energy across the board, from storage to networking to compute. While edge device SWaP (size, weight, and power) constraints demand smaller, more efficient AI models, running AI calculations closer to where data is generated can mean more carbon-efficient computing, with lower-power devices and smaller network and data storage requirements. And for dedicated AI hardware, using built-in accelerator technologies to increase performance per watt can yield significant energy savings. Our testing shows built-in accelerators can improve average performance-per-watt efficiency 3.9x on targeted workloads compared to the same workloads run on the same platform without accelerators [7].
  5. Consider open-source solutions with libraries of optimizations to help ensure you're getting the best out-of-the-box performance from your hardware and frameworks. Beyond open source, embracing open standards can help with repeatability and scale. For example, to avoid energy-intensive initial model training, consider using pre-trained models for greater efficiency and the potential for shared/federated learning and improvement over time (see the pre-trained model sketch after this list). Similarly, open APIs enable more efficient cross-architecture solutions, allowing you to build tools, frameworks, and models once and deploy them everywhere with more optimal performance.
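
To illustrate the pruning point in item 1, the PyTorch sketch below uses the torch.nn.utils.prune module to zero out 90 percent of each linear layer's smallest-magnitude weights. The toy model and the 90 percent ratio are illustrative; how far a real network can be pruned without hurting accuracy depends on the task.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model used purely for illustration.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))

# Zero out the 90% of weights with the smallest L1 magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.9)
        prune.remove(module, "weight")  # make the sparsity permanent

total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"{zeros / total:.0%} of all parameters are now zero")
```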
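
For the precision point in item 2, post-training dynamic quantization is one low-effort way to move inference from FP32 to INT8 arithmetic. The PyTorch sketch below quantizes the linear layers of a toy model; actual speed and energy gains vary by hardware and workload.

```python
import torch
import torch.nn as nn

# Toy FP32 model used purely for illustration.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))

# Convert Linear layers to INT8 for inference via dynamic quantization.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 256)
print(quantized(x).shape)  # same interface, lower-precision arithmetic
```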
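
Item 3's pattern of a small model that taps into a larger one is commonly realized through knowledge distillation; the sketch below shows the core training step, where the student is trained to match both the ground-truth labels and the teacher's softened output distribution. The models, temperature, and loss weighting here are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))  # stand-in for a large model
student = nn.Linear(32, 10)  # much smaller domain-specific model


def distillation_step(x, labels, T=2.0, alpha=0.5):
    """One training step: blend hard-label loss with a KL term that pulls
    the student's softened predictions toward the teacher's."""
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    hard_loss = F.cross_entropy(student_logits, labels)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard_loss + (1 - alpha) * soft_loss


loss = distillation_step(torch.randn(8, 32), torch.randint(0, 10, (8,)))
loss.backward()
print(f"distillation loss: {loss.item():.3f}")
```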
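
Finally, illustrating item 5, the sketch below reuses an existing pre-trained sentiment model through the open-source Hugging Face Transformers pipeline API instead of training one from scratch; the specific model name is just an example.

```python
from transformers import pipeline

# Reuse an existing pre-trained model instead of paying the energy cost
# of initial training; the model name here is one example of many.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Reusing pre-trained models saves training energy."))
```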

Like many sustainability-led decisions, designing your AI projects to reduce their environmental impact is not easy. Reducing your energy and carbon footprint requires work, intention, and compromise to make the most responsible choices. But as with other sustainability-minded business choices, even seemingly small adjustments can add up to large collective reductions in carbon emissions and help slow the effects of climate change.


References

  •  [1] The state of AI in 2022—and a half decade in review | McKinsey
  •  [2] AI and compute (openai.com)
  •  [3] Carbon equivalencies calculated with data from https://calculator.carbonfootprint.com/
  •  [4] DeepMind AI Reduces Google Data Centre Cooling Bill by 40%
  •  [5] World Wide Waste, Gerry McGovern
  •  [6] [1611.03530] Understanding deep learning requires rethinking generalization (arxiv.org)
  •  [7] See [E2] at intel.com/processorclaims: 4th Gen Intel® Xeon® Scalable processors. Results may vary.