New data centers are rapidly being built to meet growing AI demand, raising concerns about whether the U.S. can generate enough electricity and whether the grid can handle the load. Some chip companies are developing low-power processors to reduce data center energy use, but the savings are not enough to offset the growth: a single ChatGPT query consumes roughly 10 times as much power as a typical Google search.
AI data centers are also straining the aging U.S. power grid's ability to deliver generated power to where it is needed. Predictive software and sensor technologies now under development may help manage transformer loads and prevent failures.
Cooling AI data centers is another challenge, requiring substantial amounts of water. Despite these resource constraints, the drive to expand AI technology remains strong.