TAIPEI (Taiwan News) — A new research report from Digitimes released Tuesday warns that current energy infrastructure will be unable to meet the massive power demands of future AI data centers.
Analysts predict operators will adopt decentralized, low-carbon power supply strategies, with facilities beginning large-scale use of green electricity in 2026 and relying primarily on it by 2030.
By 2030, the share of fossil fuels powering AI data centers is forecast to fall from today’s 70% to below 30%. Renewable energy such as wind and solar is expected to account for more than 50%, nuclear for 10–15%, geothermal for 3–5%, and hydrogen for 1–3%.
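The forecast shares above are given as ranges; taking midpoints (an assumption on our part, since the report only gives bounds), the projected 2030 mix sums to roughly 100%:

```python
# Cross-check of the 2030 energy-mix forecast using midpoint assumptions.
# The report gives ranges; the midpoints below are illustrative only.
mix_2030 = {
    "renewables (wind/solar)": 52.5,  # "more than 50%" (assumed midpoint)
    "nuclear": 12.5,                  # 10-15%
    "geothermal": 4.0,                # 3-5%
    "hydrogen": 2.0,                  # 1-3%
    "fossil fuels": 29.0,             # "below 30%" (assumed)
}
total = sum(mix_2030.values())
print(f"Total share: {total:.1f}%")  # → Total share: 100.0%
```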
The report noted that Meta and OpenAI are planning gigawatt-scale AI data centers as early as 2026. Meta’s Prometheus and Hyperion projects are targeting a combined 6 GW, while the Stargate project, jointly developed by OpenAI and Oracle, aims for 10 GW within four years.
Digitimes said Nvidia’s GPU power consumption continues to surge, with a single AI chip projected to exceed 2 kW by 2030. This would push individual AI data centers from megawatt-scale to gigawatt-scale demand, lifting total electricity use to as high as 1,264 TWh, two to three times today’s levels.
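The scale of these figures can be sanity-checked with a rough calculation. Only the 2 kW per-chip projection and the 1,264 TWh total come from the report; the facility size and power usage effectiveness (PUE) below are illustrative assumptions:

```python
# Back-of-envelope scaling from per-chip power to fleet-level energy.
CHIP_POWER_KW = 2.0   # projected per-GPU draw by 2030 (from the report)
PUE = 1.2             # assumed overhead for cooling and distribution
FACILITY_GW = 1.0     # a gigawatt-scale campus (assumed size)

chips = FACILITY_GW * 1e6 / (CHIP_POWER_KW * PUE)  # 1 GW = 1e6 kW
annual_twh = FACILITY_GW * 8760 / 1000             # 1 GW running year-round

print(f"Chips per 1 GW facility: {chips:,.0f}")        # ≈ 416,667
print(f"Annual energy at 1 GW:   {annual_twh:.2f} TWh")  # 8.76 TWh

# Continuous load implied by the report's 1,264 TWh/yr forecast:
implied_gw = 1264 / annual_twh
print(f"Implied fleet load: {implied_gw:.0f} GW")      # ≈ 144 GW
```

Under these assumptions, the forecast corresponds to well over a hundred gigawatt-scale campuses running continuously, which is why the report frames this as an infrastructure problem rather than a chip problem.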
The study said these dual requirements of high power density and sustainability will redefine data center design. Key enabling technologies include 800V high-voltage direct current (HVDC) power distribution and liquid cooling.
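The case for 800V distribution follows from basic circuit physics: for a fixed power draw, raising the bus voltage cuts current, and resistive loss scales with the square of current. A minimal sketch, in which only the 800 V figure comes from the report and the rack power and bus resistance are assumed values:

```python
# Conduction loss for a fixed power draw at different bus voltages.
# P_loss = I^2 * R, where I = P / V.
def conduction_loss_w(power_w: float, volts: float, resistance_ohm: float) -> float:
    current = power_w / volts
    return current ** 2 * resistance_ohm

RACK_POWER_W = 100_000  # assumed 100 kW rack
R_BUS = 0.001           # assumed 1 milliohm distribution resistance

loss_48v = conduction_loss_w(RACK_POWER_W, 48, R_BUS)
loss_800v = conduction_loss_w(RACK_POWER_W, 800, R_BUS)
print(f"48 V loss:  {loss_48v:,.0f} W")   # ≈ 4,340 W
print(f"800 V loss: {loss_800v:,.0f} W")  # ≈ 16 W
```

The loss ratio is (800/48)², roughly 278×, which is why high-density AI racks push distribution voltages up.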
Digitimes analyst Yu Pei-ru (余佩儒) said future AI data centers will likely adopt a co-location model, built near power generation sites such as solar and wind farms or baseload sources like geothermal and nuclear. Pairing these with storage systems and microgrids would improve energy autonomy and flexibility.
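The storage component of such a microgrid can be roughly sized from the hours of autonomy required. Every figure below is an illustrative assumption, not from the report:

```python
# Rough on-site storage sizing for energy autonomy in a co-located microgrid.
FACILITY_MW = 500         # assumed campus load
RIDE_THROUGH_H = 4        # assumed hours to bridge generation or grid gaps
DEPTH_OF_DISCHARGE = 0.9  # assumed usable fraction of battery capacity

required_mwh = FACILITY_MW * RIDE_THROUGH_H / DEPTH_OF_DISCHARGE
print(f"Required storage: {required_mwh:,.0f} MWh")  # ≈ 2,222 MWh
```

Even this modest scenario implies multi-gigawatt-hour battery installations, underscoring why siting next to baseload sources like geothermal and nuclear is attractive.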