> By 2050, the world will need nearly three times as much power as we use today
This feels like a "no one needs more than 640k of RAM"^ kind of comment.
Only triple? In 24 years? Maybe I'm misunderstanding the transition from fossil fuels to renewables, where we're replacing one source with another rather than increasing overall power use, but it does feel like the demand for power, especially with Data Centres in the current news cycle, would take us 10x in "the shortest amount of time it's possible to 10x power generation".
> Fusion: Once the technology is fully commercialized within the next decade
I don't really want to say it, but isn't the joke that fusion has been a decade away for 50 years?
^I know this is not quite what was said, I'm just using it for reference.
Until recently, total electricity consumption in developed nations had been practically flat for decades, despite continued growth in economic output and greater demand for electricity-consuming devices. Mostly because, as usage of electricity has gone up, it's been entirely offset by endless improvements in efficiency.
So you have two opposing forces in action: rapidly increasing demand for electricity-consuming services, and rapidly increasing efficiency of those services. It also helps that a lot of that additional demand is only possible because of increased efficiency. Imagine if every phone was as power-inefficient as an old Pentium 4. It would last about 30 minutes and burn your hands in the process.
Even with datacentres and AI, there is huge economic pressure to increase the efficiency of the devices involved, and there's been no slowdown in year-on-year increases in compute/W, even if total compute per chip isn't growing as rapidly as it used to.
It's most likely because of the deindustrialization of those countries, not efficiency increases [1]. Aluminium production (electrolysis), steel production (arc furnaces), and heavy manufacturing (lathes, drills, welding, various motors, robots, etc.) were all moved elsewhere.
You may argue that Jevons paradox might not apply to home power use. I mean, how many lights and how many refrigerators could one house possibly have? But AI use and its associated power consumption is VERY susceptible to Jevons paradox.
[1] https://en.wikipedia.org/wiki/Jevons_paradox
> I mean, how many lights and how many refrigerators could one house possibly have?
Once upon a time, a "bright" bulb consumed 80W of power. Today we achieve the same light output with 8W of LEDs, an order of magnitude decrease in power consumption. Multiply that by every home, every street light, every office building, every airport, hospital and stadium. That kind of improvement in efficiency adds up very quickly.
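Back-of-envelope version of that, using the per-bulb wattages from above; the bulb count and daily hours are made-up round numbers, purely to show how it scales:

    # Rough illustration of the 80 W -> 8 W lighting transition at scale.
    incandescent_w = 80
    led_w = 8
    bulbs = 1_000_000      # hypothetical: one million bulbs across a city
    hours_per_day = 5      # hypothetical average daily use

    def annual_twh(watts_per_bulb):
        # W * bulbs * hours/day * days/year, converted from Wh to TWh
        return watts_per_bulb * bulbs * hours_per_day * 365 / 1e12

    print(annual_twh(incandescent_w))  # ~0.146 TWh/year
    print(annual_twh(led_w))           # ~0.0146 TWh/year, a 10x reduction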
Sure, plenty of developed countries have deindustrialised, but most of the heavy industry that has been lost was lost before mass electrification. Arc furnaces, large-scale aluminium production, etc. are all pretty modern technologies. If you look at the UK, the only major steel foundry left is a gas-fired blast furnace, and we have basically zero arc furnaces, because we deindustrialised before the damn things were being used for large-scale steel production.
Electrification of heavy industry is a surprisingly recent trend, and is only really happening in countries that aren't deindustrialising and that view continual process improvement in heavy industry as an important long-term activity.
Also, I think climate control is quite susceptible to Jevons paradox, especially since heavier use of air conditioning drives heavier use of air conditioning by other people in the area (all that waste heat has to go somewhere).
> The report finds that data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7 to 12% of total U.S. electricity by 2028.
Even if data centers went 10x, that would only increase our electricity use by a bit over a third.
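Quick sanity check on that arithmetic (the 4.4% share is from the quoted report; the only assumption is that everything outside data centers stays flat):

    # If data centers are 4.4% of US electricity and that slice grows 10x
    # while the rest stays constant, how much does the total grow?
    dc_share = 0.044                 # 2023 share from the quoted report
    rest = 1 - dc_share              # everything else, held constant
    new_total = rest + dc_share * 10
    print(new_total)                 # ~1.396
    print((new_total - 1) * 100)     # ~39.6% increase, i.e. a bit over a third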
But for something more fun:
This[1] says global energy use is 186,000 TWh/year. Or an average of about 21.2 TW.
The surface area of the earth[2] is 510 million km^2, or 510 trillion m^2.
Which works out to global energy use being an average of about 42 mW / m^2.
Per Wikipedia[3], the IPCC says that human-caused greenhouse warming is 2.72 W/m^2.
Which is "only" about 65 times global energy use.
Which means if we did start using double-digit multiples of our current energy use, it starts to matter whether we're adding that energy to the environment (fission/fusion, fossil, probably geothermal) or just redirecting it (hydro, wind, solar). With the caveat for solar that the panels probably have lower albedo than what they're on top of.
[1] https://ourworldindata.org/energy-production-consumption
[2] https://www.universetoday.com/articles/surface-area-of-the-e...
[3] https://en.wikipedia.org/wiki/Radiative_forcing
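For anyone who wants to redo the arithmetic, here it is spelled out, using the figures cited above (rounding is mine):

    # Reproducing the back-of-envelope comparison above.
    global_energy_twh_per_year = 186_000   # from Our World in Data [1]
    earth_surface_m2 = 510e12               # 510 million km^2 [2]
    ipcc_forcing_w_per_m2 = 2.72            # IPCC radiative forcing figure [3]

    hours_per_year = 24 * 365.25
    avg_power_w = global_energy_twh_per_year * 1e12 / hours_per_year
    print(avg_power_w / 1e12)               # ~21.2 TW

    flux_w_per_m2 = avg_power_w / earth_surface_m2
    print(flux_w_per_m2 * 1000)             # ~42 mW/m^2

    print(ipcc_forcing_w_per_m2 / flux_w_per_m2)  # ~65x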
Many advanced economies have seen declining energy use per capita, and flat energy use overall, in the last few decades. It is not unreasonable to assume this trend will continue and spread as more countries become wealthy.
I don't understand this love affair with nuclear energy, especially in a country full of sunny deserts. Cover a fraction of them in solar panels coupled with sodium batteries, and the problem is solved. But for some reason this idea is not being considered seriously. Why?
I understand that in the '50s we needed reactors to create plutonium to fend off the Russians.
I understand that in the '80s solar panels were expensive.
But now, when panels are cheap, lithium batteries are cost-competitive, and sodium batteries are being actively developed (and already put into cars), there is simply no excuse.