A new report by the Berkeley Lab, which analyzes data center electricity demand, forecasts that this demand is exploding: from an already high 4.4% of all U.S. electricity use, to a possible 12% by 2028, just over three years from now. (In Ireland, data centers already consume 18% of total electricity generation.) According to the report, which is issued periodically, data center energy use was stable with little growth from 2010 to 2016, but that changed from 2017 on, as data centers filled with “accelerated servers” to power artificial intelligence applications for the military-industrial complex and for consumer products and services.
Many claims are now made that this data center/AI revolution has produced a surge in capital investment, in semiconductor technology, and in U.S. economic productivity. To take a typical example, a report by JPMorgan Chase bank analysts boasts of 2.2% labor productivity growth in the third quarter and says, “the widespread integration of artificial intelligence (AI) emerged as a transformative force with the potential to further elevate productivity.” But the “labor productivity” economists talk about always booms when deep recessions lay off millions, as it did in 2020-22, so it is not a real indicator of economic progress.
Much clearer is total factor productivity: the portion of economic growth estimated to arise from technological progress, after changes in labor force employment, skill and education, and capital investment are accounted for. It could be called “technological productivity growth.”
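As a rough sketch (this is the standard growth-accounting formulation, not a formula taken from the report itself), total factor productivity growth is measured as the residual of output growth after subtracting the share-weighted growth of capital and labor inputs:

\[
\frac{\Delta A}{A} \;\approx\; \frac{\Delta Y}{Y} \;-\; \alpha\,\frac{\Delta K}{K} \;-\; (1-\alpha)\,\frac{\Delta L}{L}
\]

where \(Y\) is output, \(K\) is capital, \(L\) is labor (adjusted for skill and education), and \(\alpha\) is capital’s share of income. Whatever growth the measured inputs cannot explain is attributed to technological progress.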