Partner POV | Debunking the AI energy shortage: Fact or fiction?
This article was written by Joseph Reele, Vice President, Datacenter Solution Architects at Schneider Electric.
We've all seen the predictions that generative AI will drive unprecedented demand for electricity to run data centers. This has caused a lot of handwringing over how quickly we can add AI data center capacity and where we'll find all the required power. Don't panic! The situation isn't as dire as it seems when you consider these factors:
- AI evolution – Like other technologies before it, AI will become more efficient and cheaper to operate as adoption increases.
- Edge computing – Compute, network, and storage are no longer the exclusive domain of data centers. For many applications, the edge is where it's at.
- On-site power generation – Thanks to continued innovation, a new energy landscape has emerged (battery energy storage systems, fuel cells, and more). Power generation is no longer limited to utilities, and distributed energy resources are becoming more widely available.
- Sustainable practices – The proliferation of energy-efficient products and improved power management, along with new policies and regulations, will ease the electricity challenge. AI itself is also helping to unlock operational efficiencies and stranded capacity.

Let's take a closer look at each of these factors:
AI evolution
AI is following the same pattern of evolution as many other technologies and inventions that came before. Take the automobile: only wealthy people could afford cars at first, but as economies of scale were achieved, cars became accessible to just about everyone. You could say the same about air travel, PCs, big-screen TVs – you name it. AI is no different. As algorithms become smarter and more efficient, AI will be more widely accessible. Think about it: AI is already built into newer smartphones and PCs, and AI-generated answers now appear in some search engines.
Edge computing
The concern about powering data centers is understandable. However, remember that not everything runs in data centers. A lot of compute, network, and storage happens at the edge – which could be your home or office. You can't have the smart devices that enable smart grids, cities, buildings, and homes without some local computing, networking, and storage. These capabilities must sit close to the user to minimize latency, ensure quality, and maintain security. Life happens at the edge, and we are now seeing smarter, more connected, and more autonomous use cases being developed and deployed.
Smart, connectable products and systems – those that enable smart lighting and HVAC, support smart manufacturing, and connect household devices like baby monitors – rely on a local network, and their traffic doesn't necessarily travel to a data center. For example, if you are home and the internet goes out, you can likely still switch your lights on and off from a phone, tablet, or computer connected to your home network. Or, in a hospital with a building automation system, operators can still control the heating and cooling plant via the local network and control room consoles, even if the internet goes out.
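To make this concrete, here's a minimal sketch of local device control, assuming a hypothetical smart light that exposes a simple HTTP endpoint on the home network. The device address and the /state API are illustrative only, not any specific product's interface; the point is that the command travels only across the local network, with no data center round trip.

```python
# A minimal sketch of local (edge) device control, assuming a hypothetical
# smart light that exposes a simple HTTP API on the home network.
# The request never leaves the LAN, so no cloud or data center is involved.
import requests

LIGHT_ADDR = "http://192.168.1.42"  # hypothetical local IP of the smart light


def set_light(on: bool) -> bool:
    """Send an on/off command directly over the local network."""
    try:
        resp = requests.post(f"{LIGHT_ADDR}/state", json={"on": on}, timeout=2)
        return resp.status_code == 200
    except requests.RequestException:
        # This only fails if the *local* network is unreachable,
        # even when the internet connection is down.
        return False


if __name__ == "__main__":
    if set_light(True):
        print("Light switched on via the local network")
    else:
        print("Could not reach the light on the LAN")
```

The same pattern scales up to building automation: controllers and consoles on the facility network keep working whether or not the wide-area connection does.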
On-site power generation
Demand for electricity is at an all-time high, while supply from electrical utility providers is constrained. Meanwhile, technology requires more power to maximize its "productivity" (growth in data centers, both AI and cloud), and the reshoring of manufacturing and industry brings large load requirements of its own.
These types of facilities often require on-site standby (backup/emergency) power generation equal to the utility supply, and most data centers build some redundancy into those standby power plants. The new energy landscape continues to develop, and at scale.
Solar, wind, fuel cells, linear generators, and even advanced nuclear power technologies are becoming a more prevalent part of the grid. Now add the advancement of facility "adaptive electrical energy" hardware and solutions, where on-site systems become bidirectional. This capability lets a facility communicate and react to grid conditions, and also feed electricity generated on-site back to the grid. The same applies to any home or building with solar panels and battery storage that generates its own power and can export surplus electricity to the grid.
The shift to adaptive electrical energy will, of course, vary by region, depending on local policies and regulations. Self-sufficiency enables smart homes and buildings, which in turn enable the smart grid and smart cities. Buildings such as data centers, hospitals, factories, and warehouses that generate and store power on-site will help alleviate the electricity crunch. So, yes, AI will drive demand. Still, thanks to the increasingly distributed nature of computing and power generation, and the advancement of digital systems, it won't be as bad as we may fear.
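To illustrate what bidirectional operation means in practice, here's a simplified sketch of a dispatch rule for a site with on-site generation and battery storage. The names, thresholds, and units are assumptions made for the example, not an actual Schneider Electric control system: surplus power charges the battery first and any remainder is exported to the grid, while deficits are covered by the battery before importing.

```python
# Illustrative dispatch rule for a site with on-site generation and battery storage.
# All names and values are assumptions for this sketch, not real control logic.
from dataclasses import dataclass


@dataclass
class SiteState:
    generation_kw: float     # on-site generation (solar, fuel cell, etc.)
    load_kw: float           # facility demand
    battery_soc: float       # battery state of charge, 0.0 to 1.0
    battery_power_kw: float  # max charge/discharge rate


def dispatch(state: SiteState) -> dict:
    """Decide how surplus or deficit power is handled for one interval."""
    surplus_kw = state.generation_kw - state.load_kw
    if surplus_kw > 0:
        # Charge the battery first; export anything beyond its charge rate or capacity.
        charge_kw = min(surplus_kw, state.battery_power_kw) if state.battery_soc < 1.0 else 0.0
        return {"battery_charge_kw": charge_kw,
                "grid_export_kw": surplus_kw - charge_kw,
                "grid_import_kw": 0.0}
    # Deficit: discharge the battery first, then import the remainder from the grid.
    deficit_kw = -surplus_kw
    discharge_kw = min(deficit_kw, state.battery_power_kw) if state.battery_soc > 0.0 else 0.0
    return {"battery_charge_kw": -discharge_kw,
            "grid_export_kw": 0.0,
            "grid_import_kw": deficit_kw - discharge_kw}


# Example: a sunny afternoon with light load and a full battery exports the surplus.
print(dispatch(SiteState(generation_kw=500.0, load_kw=300.0, battery_soc=1.0, battery_power_kw=250.0)))
# -> {'battery_charge_kw': 0.0, 'grid_export_kw': 200.0, 'grid_import_kw': 0.0}
```

Real adaptive systems layer in grid signals, tariffs, and forecasts, but the core idea is the same: on-site resources can respond to conditions on both sides of the meter.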
Sustainable practices
Just about every organization under the sun wants to decarbonize by employing sustainable practices. More than 60% of energy in the U.S. is wasted through inefficiencies – something that will change as we get smarter about consumption. Think about how carmakers started harnessing braking energy to recharge vehicle battery systems. Innovations like these will occur at large scale and help address the energy challenge. Companies like Schneider Electric are developing products made with sustainable materials that are significantly more energy-efficient and digitally connected than their predecessors. This includes the development and use of advanced software embedding predictive AI and advanced analytics. At the same time, organizations are implementing comprehensive energy management systems and practices that boost efficiency and reduce consumption. The future is more digital, efficient, and sustainable.
Successfully scaling for AI energy needs
By now, you've figured out that I'm not overly worried about AI's effect on energy consumption. That isn't to say we should sit back and do nothing. The factors I've discussed will help, and Schneider Electric has the solutions, programs, and practices to help with your energy needs and sustainability challenges. We offer hardware, software, services, and consulting that cover power and grid, behind-the-meter power at scale, and modular data centers, as well as offerings spanning all electrical and mechanical technologies (including liquid cooling) for environments from grey space to white space, from hyperscale to the edge, and from the grid to the chip and the chip to the chiller. We also provide consulting services to help build and run data centers more effectively and efficiently.
Make no mistake about it—there certainly is an energy challenge, but it might not be as dire as it has been portrayed. Better energy management, distributed energy resources, and policies and regulations that align with more efficient and optimized energy use will enable us to meet the AI energy challenge together.