Our AI overlords will need more power than the Sun generates within 43 years

My question to Grok: “Extrapolating from the past three years how long before running LLMs consumes more power than the entire Sun generates?”

Extrapolating from the growth in global AI data center power demand over the past three years (2022–2025), which serves as a proxy for LLM power consumption given that LLMs drive much of the recent AI surge, we can estimate the timeline. Data from SemiAnalysis (as cited in a RAND report) shows the following facility power demand for AI data centers:

  • 2022: 1.451 GW
  • 2023: 4.28 GW
  • 2024: 10.6 GW
  • 2025: 19.9 GW

To arrive at the extrapolation:

  1. Fit an exponential growth model to these data points. Using linear regression on the natural log of power vs. year (with 2022 as t=0), the slope is approximately 0.876, corresponding to an annual growth factor of e^0.876 ≈ 2.401 (i.e., power multiplies by about 2.4 each year on average).
  2. The Sun generates 3.826 × 10^26 watts of power, equivalent to 3.826 × 10^17 GW.
  3. Starting from 19.9 GW in 2025, the factor needed to exceed the Sun’s output is (3.826 × 10^17) / 19.9 ≈ 1.923 × 10^16.
  4. Solve for years n: 2.401^n = 1.923 × 10^16, so n = log(1.923 × 10^16) / log(2.401) ≈ 43.

Thus, it would take approximately 43 years from 2025 (i.e., by around 2068). This assumes the exponential growth rate from the past three years continues indefinitely, though in reality, physical, technological, and economic constraints would likely slow or halt such growth long before reaching solar scales.
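The four steps can be reproduced with a short script (a sketch using only the four data points and physical constants quoted above; variable names are mine):

```python
import math

# AI data-center facility power demand in GW (SemiAnalysis, via RAND)
years = [2022, 2023, 2024, 2025]
power_gw = [1.451, 4.28, 10.6, 19.9]

# Step 1: least-squares fit of ln(power) vs. years since 2022
t = [y - years[0] for y in years]
ln_p = [math.log(p) for p in power_gw]
t_bar, p_bar = sum(t) / len(t), sum(ln_p) / len(ln_p)
slope = (sum((ti - t_bar) * (pi - p_bar) for ti, pi in zip(t, ln_p))
         / sum((ti - t_bar) ** 2 for ti in t))
growth = math.exp(slope)  # annual growth factor, ~2.40x

# Steps 2-4: years until 19.9 GW, compounding at `growth`, exceeds the Sun
sun_gw = 3.826e17  # solar luminosity, 3.826e26 W expressed in GW
n = math.log(sun_gw / power_gw[-1]) / math.log(growth)
print(f"growth {growth:.3f}x/yr, crossover in {n:.1f} years (~{2025 + round(n)})")
```

Running it reproduces the ~2.4× annual factor and the ~43-year horizon.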


Note that ChatGPT 5 failed to answer this identically worded question: it used data center power consumption growth, not LLM- or AI-specific power consumption growth.

When does the exponential growth stop? We can power a human brain with an Egg McMuffin. Is the answer an analog chip with a digital interface? IEEE has a 2024 article on a Chinese design using optical computing that supposedly cuts power consumption by 99%. But if these improvements were real, would the AI companies be signing 20-year leases for nuclear power?

11 thoughts on “Our AI overlords will need more power than the Sun generates within 43 years”

  1. I’m not really a Muskovite, however his brain chip implant idea could offer a solution. Instead of hastily designed, inefficient, silicon ANNs, we could use HNN (human neural nets), evolved over millions of years, directly connected to the ‘Net. Suspension in a saline-solution tank optional. Or scrap AI, go back to keyboards, video, and mice, as well as jobs, and use the surplus GPUs to play games.

  2. This reminds me of extrapolations from the 1990s about processor heat approaching that of the surface of the sun. Nicely done.

    • > inject more mass into Jupiter to make it another star

      Musky will need a cheap source of monoliths, though–e-waste maybe? Someone else knows the plot to Clarke’s 2010! I read it in ’82, I think, and remembered that.

      ALL THESE WORLDS ARE YOURS – EXCEPT EUROPA. ATTEMPT NO LANDING THERE.

      — HAL-9000

  3. Reminds me of flawed Limits to Growth, or Peak Oil hypotheses. And The Matrix: pod people making power for AI?

    “The Sun generates 3.826 × 10^26 watts of power, equivalent to 3.826 × 10^17 GW.”

    Apples and oranges: thermal vs. electrical power. Entertaining numbers, though.

    World electricity generation capacity is around 9,000 × 10^6 kWe, i.e. about 9,000 GWe.

    https://www.eia.gov/international/data/world/electricity/electricity-capacity

    You’ll get to that much quicker, even if it grows too. Hydro power could increase with climate change, if you believe the predictions of more storms and rainfall, adjusted naturally for sea levels a few inches higher. Except when they have to spill the excess. Wind power too, except when turbines trip off at high wind speeds. Don’t worry about solar cycles.

  4. > When does the exponential growth stop?

    Economists + politicians: exponential growth must go on forever

    Engineers: there are only 3 curves in nature: sinusoid, decaying exponential, S-curve

    • @perplexed Gaussian curve = 2 opposed S-curves? Golden ratio = xy sinusoid * decaying exponential?

    • Anon, how about a parabola, then? The trajectory of an object thrown in a gravitational field? It is not exponential; it is polynomial by definition. Two S-curves again? An S-curve approximation is a fudge for any smooth line, since its curvature is not defined. It is used in AI simplifications, though.

  5. Extrapolating from the past three years how long before running LLMs consumes all of the energy we produce, assuming we can increase energy production by 3% every year?

    GPT-5:

    Short answer (with a big asterisk):
    If you (i) start from the latest IEA baseline for AI in data centres in 2024, (ii) assume that “running LLMs” grows as fast as AI accelerator electricity has been growing recently, and (iii) let global electricity production grow 3%/yr, then in a purely exponential, no-constraints world, LLM-related electricity use would mathematically catch up with all global electricity generation sometime in the 2050s. Using the same math against all primary energy instead of electricity pushes the crossover toward the late 2050s to 2070s. The real world will almost certainly bend those curves long before then (see caveats). [1][2][3][4]
    Below I show the exact assumptions, the back-of-the-envelope calculation, and several scenarios so you can see how sensitive the result is.

    1) What we can anchor on today
    • Global electricity generation (2024): Low carbon sources generated 12,609 TWh, equal to 40.9% of the mix ⇒ total generation ≈ 30,829 TWh (12,609/0.409). [3]
    • Data centre electricity (2024): ≈ 415 TWh, ~1.5% of global electricity; DC demand grew ~12%/yr in the last five years. [1][5]
    • AI share inside data centres (2024): AI/accelerator servers used ~15% of total DC electricity (and ~24% of server electricity). So AI electricity ≈ 0.15 × 415 TWh ≈ 62 TWh in 2024 (training + inference combined). [2]
    • Recent growth of AI workloads: In IEA’s 2025 Energy & AI analysis, electricity in accelerated (AI) servers grows ~30%/yr through 2030 in the Base Case; total DC electricity grows ~15%/yr. We’ll use the 30% as the “past few years trend” proxy you asked to extrapolate. [1]
    • All primary energy (2024): ~592 EJ ≈ 164,445 TWh (1 EJ = 277.78 TWh). This is for the “all energy we produce” interpretation. [4]
    Important scope note — LLMs use electricity. So the most apples to apples comparison is LLM electricity vs global electricity, not vs all primary energy. I show both because you used “all of the energy we produce.”

    2) The simple math (exponential growth race)
    Let:
    • E_LLM,0 = 2024 LLM electricity. We don’t know the exact LLM fraction of AI electricity, so we parameterise it as f: E_LLM,0 = f × 62.25 TWh (with f between 0.33 and 1.0 in the scenarios below). [2]
    • E_grid,0 = 2024 world electricity ≈ 30,829 TWh. [3]
    • Growth rates: LLM electricity grows at g_LLM (we test 30%, 20%, 15%); grid electricity grows at your assumed g_grid = 3%.
    Solve for the year t when E_LLM,0 × (1 + g_LLM)^t = E_grid,0 × (1 + g_grid)^t.
    That gives: t = ln(E_grid,0 / E_LLM,0) / ln((1 + g_LLM) / (1 + g_grid))
    I computed this for several scenarios; results are calendar years counting from 2024:
    A) Comparing to global electricity
    Assumptions (annual growth)   LLM share of “AI electricity” (f)   Years from 2024   Crossover year
    LLM 30%/yr, grid 3%/yr        100% (all AI is LLMs)               26.7              ≈ 2051
    LLM 30%/yr, grid 3%/yr        60%                                 28.8              ≈ 2053
    LLM 30%/yr, grid 3%/yr        33%                                 31.4              ≈ 2055
    LLM 20%/yr, grid 3%/yr        100%                                40.6              ≈ 2065
    LLM 20%/yr, grid 3%/yr        60%                                 44.0              ≈ 2068
    LLM 20%/yr, grid 3%/yr        33%                                 47.9              ≈ 2072
    LLM 15%/yr, grid 3%/yr        100%                                56.3              ≈ 2080
    (Baseline numbers: E_grid,0 ≈ 30,829 TWh; AI in DC 2024 ≈ 62.25 TWh. See sources above.) [3][2]
    B) Comparing to all primary energy
    Here we compare LLM electricity to all energy produced (592 EJ ≈ 164,445 TWh), still letting “energy production” grow at 3%/yr.
    Assumptions (annual growth)   LLM share of “AI electricity” (f)   Years from 2024   Crossover year
    LLM 30%/yr, energy 3%/yr      100%                                33.8              ≈ 2058
    LLM 20%/yr, energy 3%/yr      100%                                51.6              ≈ 2076
    LLM 15%/yr, energy 3%/yr      100%                                71.5              ≈ 2095
    (Same baseline, but using total primary energy.) [4]

    3) Why these “crossover” years are upper bound hypotheticals
    1) We started from “AI in data centres,” not “LLMs only.” The best public split says AI workloads consumed ~15% of DC electricity in 2024; LLMs are a subset of that (hence the f parameter above). [2]
    2) IEA’s 30%/yr is a near-term (to 2030) growth rate for accelerator electricity, not a forever rate. Their own Base Case emphasises large uncertainty bands, grid bottlenecks, and scenario sensitivity. [1]
    3) Efficiency is improving very fast. For example, Google’s production measurement shows the median per-prompt energy for Gemini Apps is ~0.24 Wh, and they report ~33× per-prompt energy improvement over a recent 12-month period due to hardware, software and serving-stack optimisations. If trends like that persist, they directly reduce the effective growth of LLM electricity. [6]
    4) Physical and grid constraints are already binding in many regions. IEA notes concentration of DC demand in specific hubs and potential delays connecting planned capacity. These frictions tend to slow exponential growth curves in practice. [1]
    5) Global electricity itself may grow faster than 3% in some years (IEA measured ~4.3% in 2024), which would also push the crossover further out. [7]

    4) Bottom line, stated plainly
    • Under very aggressive assumptions (LLMs ≈ all AI, sustained 30%/yr growth vs a 3%/yr grid), you hit the “LLMs = all electricity” line around 2051. Relax those assumptions (e.g., LLMs are 33–60% of AI, or LLM growth slows to 20%/yr), and the crossover slides to the mid-2060s to early 2070s. Compared against all primary energy, the crossover drifts later still (late 2050s → 2090s depending on assumptions). These are mathematical extrapolations, not forecasts. [1][2][3][4]
    • The more realistic expectation is: LLMs will not consume “all energy we produce.” Efficiency gains (model, compiler, serving, quantisation, chips), siting constraints, and the economics of power supply expansion should flatten the curve well before any such crossover. [1][6]

    5) Want a calculator?
    If you’d like, I can drop this into a one-pager (Excel/Sheets) where you can set:
    • Initial LLM share of AI electricity (f)
    • LLM growth rate (with an optional taper after 2030)
    • Electricity vs primary energy comparison
    • Grid/energy growth trajectories (constant 3% or a time-varying path)
    …and it’ll chart the crossover year automatically.
    Would you prefer I default the LLM fraction to 50% of AI electricity, or do you have a specific split you want to test?

    References
    [1] Energy demand from AI – Energy and AI – Analysis – IEA
    [2] Data centres will use twice as much energy by 2030 — driven by AI
    [3] Global Electricity Review 2025 Electricity – ember-energy.org
    [4] Insights by source and country | Statistical Review of World Energy
    [5] Global data center power demand to double by 2030 on AI surge: IEA
    [6] In a first, Google has released data on how much energy an AI prompt …
    [7] Electricity – Global Energy Review 2025 – Analysis – IEA
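GPT-5’s crossover formula is easy to sanity-check. A minimal script (a sketch assuming the 62.25 TWh AI baseline and 30,829 TWh grid baseline quoted above; the function name is mine) reproduces the scenario A rows:

```python
import math

def crossover_years(e_llm0, e_base0, g_llm, g_base):
    """Years t until e_llm0*(1+g_llm)^t overtakes e_base0*(1+g_base)^t."""
    return math.log(e_base0 / e_llm0) / math.log((1 + g_llm) / (1 + g_base))

AI_TWH_2024 = 62.25      # AI electricity in data centres, 2024 (IEA-based)
GRID_TWH_2024 = 30_829   # world electricity generation, 2024

# Scenario A: LLM electricity +30%/yr vs. grid +3%/yr, for three LLM shares f
for f in (1.0, 0.60, 0.33):
    t = crossover_years(f * AI_TWH_2024, GRID_TWH_2024, 0.30, 0.03)
    print(f"f = {f:.2f}: {t:.1f} years -> ~{2024 + round(t)}")
```

The printed values match the 26.7-, 28.8-, and 31.4-year rows in table A; swapping in 164,445 TWh of primary energy as the baseline reproduces table B the same way.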

  6. Slightly unrelated, but a friend who’s pretty high up at Google was telling me that they are letting people go in small doses this time, as opposed to the mass layoffs they did previously, in order to free up budget to purchase hardware for AI.
