Hyperscalers have already outspent most famous US megaprojects (twitter.com)

by nowflux 283 comments 282 points


[−] timmg 27d ago
This tweet shows it as a percentage of US GDP:

https://x.com/paulg/status/2045120274551423142

Makes it a little less dramatic. But also shows what a big **'n deal the railroads were!

[−] manquer 27d ago
GDP adjustments are warranted, but it is even more stark than either estimate suggests.

The megaprojects of previous generations all had decades-long depreciation schedules. Many railways, bridges, tunnels, dams and other utilities that are 50-100+ years old are still in active use with only minimal maintenance.

Amortized year over year, the current spend would dwarf everything else, given the reported depreciation schedule of 6(!) years for the GPUs - the largest line item.
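A back-of-envelope sketch of the point, using straight-line depreciation and purely hypothetical round numbers (the dollar figures below are illustrative, not taken from the thread):

```python
def annualized_cost(capex: float, lifetime_years: int) -> float:
    """Straight-line depreciation: annual expense of a capital outlay."""
    return capex / lifetime_years

# Hypothetical figures for illustration only: the same $300B outlay,
# depreciated over a 6-year GPU schedule vs. a 75-year rail schedule.
gpu_annual = annualized_cost(300e9, 6)    # $50B/year
rail_annual = annualized_cost(300e9, 75)  # $4B/year

print(f"GPU:  ${gpu_annual / 1e9:.0f}B/year")
print(f"Rail: ${rail_annual / 1e9:.0f}B/year")
print(f"Ratio: {gpu_annual / rail_annual:.1f}x")  # 12.5x
```

Same nominal capex, 12.5x the annual expense - which is why comparing headline totals without the depreciation schedule understates the difference.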

[−] gravypod 27d ago
The side effects of spending funds on these mega projects are also something to consider. NASA spending has created a huge pile of technologies that we use day to day: https://en.wikipedia.org/wiki/NASA_spin-off_technologies.
[−] tremon 26d ago
Maybe if we'll get rack-sized fusion reactors out of it, I will consider the AI/Datacenter spending craze in the same light as NASA projects. Until then, they are rich kids' vanity projects and nothing more.
[−] delusional 27d ago

> NASA spending has created a huge pile of technologies that we use day to day

We're a little too early to know if that's the case here too. I do foresee a chance at a reality where AI is a dead end, but after it we have a ton of cheap GPU compute lying about, which we all rush to somehow convert into useful compute (by emulating CPUs or translating traditional algorithms into GPU-oriented ones or whatever).

[−] theptip 26d ago
If all AI progress somehow immediately halted, the models that have currently been built will still have more economic impact than the Internet.

Not least because the slower the frontier advances, the cheaper ASICs get on a relative basis, and therefore the cheaper tokens at the frontier get.

We have a massive scaffolding capability overhang, give it ten years to diffuse and most industries will be radically different.

Again, all of this is obvious if you spend 1k hours with the current crop; this isn't making any capability gain forecasts.

Just for a dumb example, there is a great ChatGPT agent for Instacart: you can share a photo of your handwritten shopping list and it will add everything to your cart. Now follow through the obvious product conclusions of this capability - every grocery vendor's app, integration with your fridge, learning your personal preferences for brands, recipe recommendation systems, logistics integrations with your forecasted/scheduled demand, etc. I contend that will be equivalent in engineering effort and impact to the move from brick and mortar to online stores.

[−] delusional 26d ago
You have to agree that it's totally possible that none of those things you are envisioning getting built out actually end up working as products, right?

AI (LLM) progress would stop, and then everything people try to do with those last and most capable models would end up uninteresting or at least temporary. That's the world I'm calling a "dead end".

No matter how unlikely you think that is, you have to agree that it's at least possible, right?

[−] theptip 26d ago

> then everything people try to do with those last and most capable models would end up uninteresting

I believe that some of my made up examples won’t end up getting built, but my point is that there is _so much_ low hanging fruit like this.

Of course, anything is _possible_, but let’s talk likelihood.

In my forecast the possible worlds where progress stops and then the existing models don’t end up making anything interesting are almost exclusively scenarios like “Taiwan was invaded, TSMC fabs were destroyed, and somehow we deleted existing datacenters’ installed capacity too” or “neo-Luddites take over globally and ban GPUs”, all of this gives sub-1% likelihood.

You can imagine 5-10% likelihood worlds where the growth rate of new chips dramatically decreases for a decade due to a single black-swan event like Taiwan getting glassed, but that’s a temporary setback not a permanent blocker.

Again, I’m just looking at all the things that can obviously be built now, and just haven’t made it to the top of the list yet. I’m extremely confident that this todo list is already long enough that “this all fizzles to nothing” is basically excluded.

I think if model progress stops then everyone investing in ASI takes a big haircut, but the long-term stock market progression will look a lot like the internet after the dot com boom, ie the bloodbath ends up looking like a small blip in the rear view mirror.

I guess, a question for you - how do you think about coding agents? Don’t they already show AI is going to do more than “end up uninteresting”?

[−] Anamon 25d ago
Coding agents are interesting, but in my opinion also many worlds away from what they're being sold as. They can be helpful and a moderate efficiency gain, if you know where to use them and you're careful to not fall into one of their many traps where they end up being a massive cost and efficiency loss down the line. They're helpful tools, but they're slow, expensive, and unreliable -- in order of decreasing likelihood that that's going to change in a big way.

I find it interesting that you chose the shopping list and fridge examples, because my view on the whole LLM hype is that 99% of it is a solution looking for a problem, and shopping and the fridge are historically such a commonly advertised area for technologies desperately looking for an actual use case. I don't think fridge content management and shopping plans are actual pain points in most people's lives. It's not something people would see a benefit in if they didn't have to do it manually. And it's an area with a very low tolerance for the systemic unreliability. The guy needed eggs to bake his cake, but the AI got him eggos instead -- et voilà, another person who thinks this whole "smart" technology is shit and won't deal with it anymore.

And so it goes with most AI use cases I've seen so far. In my view the only thing they're good at is fuzzy search. Coding agents are helpful, but in the end, their secret sauce is just that: fuzzy search.

Can fuzzy search be helpful? Yes, even very helpful! "Bigger than the Internet" helpful? I think not.

[−] delusional 26d ago

> Of course, anything is _possible_, but let’s talk likelihood.

The problem with talking likelihood is that it's an interpretation game. I understand you think it's wholly unlikely that it all fizzles out, I could read that from your first post. I hope it's also clear that I do think it's likely.

That's the point where we have to just agree to disagree. We have no rapport. I have no reason to trust your judgment, and neither do you mine.

[−] datatrashfire 26d ago
I feel a lot of people in tech have this incuriously deterministic attitude about LLMs right now... previous technologies revolutionized the world, therefore LLMs will! Despite there really being nothing to show for it so far, other than writing rote code being a bit easier - and that still requires active babysitting by someone who knows what they are doing.
[−] petra 26d ago
Even if chatbot LLMs stop at their current capability, there's a whole ecosystem of scientific language models (in drug discovery, chemistry, materials design, etc.) and engineering language models (software, chip design, etc.) that are very valuable in their fields.

And even if chatbot LLMs seem to be a dead end, they and other machine-learning algorithms will be happy to use the data centers to create/discover a lot of stuff.

[−] m_mueller 27d ago
e.g. the climate models that could be run on some of these systems would dwarf anything we've been able to do so far.
[−] TeMPOraL 27d ago
AI progress may fizzle out, but everything it produced so far would still be there. Models are just big bags of floats - once trained, they're around forever (well, at least until someone deletes them), and the same is true of the harnesses they run in (they're just programs).

But AI proliferation is not stopping soon, because we've not picked up even the low hanging fruits just yet. Again, even if no new SOTA models were to be trained after today, there's years if not decades of R&D work into how to best use the ones we have - how to harness the big ones, where to embed the small ones, and of course, more fundamental exploration of the latent spaces and how they formed, to inform information sciences, cognitive sciences, and perhaps even philosophy.

And if that runs out or there is an Anti AI Revolution, we can still run those weather models and route planners on the chips once occupied by LLMs - just don't tell the proles that those too are AI, or it's guillotine o'clock again.

[−] vrighter 25d ago
What will happen is that new buzzwords will be invented, and a new fad will take its place. And we will be stuck with the short end of the stick again. You can hope, but shit doesn't really get cheaper for us common folk, ever. :/
[−] PunchyHamster 27d ago
I think there is little chance it is a "dead end" - it's here to stay - but LLMs at least seem to have hit the diminishing-returns curve already, despite what investors might think, and so far none of the big providers actually makes money on all that investment.
[−] lukewarm707 27d ago
when ai is dead we can use all those gpus for zucc's metaverse xD
[−] Lerc 27d ago
The shovels and labour used to make those things were not depreciated.

The GPUs are the shovels, not the project. AI at any capability will retain that capability forever. It only gets reduced in value by superior developments, which are built upon technologies that the previous generation developed.

[−] pembrook 27d ago
Only half of the rail capacity that existed during the railroad boom times was still in use by the 1970s. Lots of it was never really used at all after various railroads went bankrupt. But your point still stands.

That said, I'm pretty sure in a compute-hungry AI world you aren't going to retire GPUs every 6 years anymore. Even if compute capacity jumps such that current H100s only represent 10% of total compute available in 6 years, you're still running those H100s until they turn to dust.

I just think it's hard to compare localized railroad infrastructure to globalized AI capacity and say one was more rational than the other on a % of GDP basis until the history actually plays out.

If you compare global investment in nuclear weapons, it would dwarf the Manhattan Project and AI thus far, and yet 99.99999% of nuclear weapons investment is just "wasted" capacity in that it has never been "used." But the value it has created in other ways (MAD-enabled peace) has surely been profitable on net. Nobody would have predicted this at the time.

Playing armchair internet pessimist about the "new thing" always makes you feel smart but is usually not a good idea since you always mis-price what you don't know about the future (which is almost everything).

[−] phreeza 26d ago
That's definitely true for some of them, but for others it's not so clear, like the Apollo or Manhattan projects? Those of course also have lasting impact but it's more in terms of knowledge, which at least arguably we are also accruing with these data centers.
[−] elil17 27d ago
I think there's more nuance to it. The real asset is the models that are being created.

Imagine this world: the bubble "pops" in a couple years. The GPUs stick around for a few more years after that. At the end, we pretty much don't train new foundation models anymore - no one wants to spend the money on the hardware needed to make a real advance.

People continue to refine, distill, and optimize the existing foundation models for the next century or two, just like people keep laying new track over old railway right of ways.

[−] brookst 27d ago
I'm not sure tax depreciation rates are the best measure here. Those GPUs will be used for much longer than 6 years, and the returns from the businesses they enable will last an order of magnitude longer.
[−] wr2 27d ago
Also, railways always had alternative uses at the time - e.g. logistics in warfare.

What other uses do GPUs have that are critical...? lol

In addition to your points, this is why I always laugh when people make these backward comparisons. What characteristics do they have in common? Very little.

[−] rayiner 27d ago
Great point!
[−] tripletao 27d ago
This seems to show the railroads peaking around 9% of GDP. While that's lower than some of the other unsourced numbers I've seen, it's much higher than the numbers I was able to find support for myself at

https://news.ycombinator.com/item?id=44805979

The modern concept of GDP didn't exist back then, so all these numbers are calculated in retrospect with a lot of wiggle room. It feels like there's incentive now to report the highest possible number for the railroads, since that's the only thing that makes the datacenter investment look precedented by comparison.

[−] chromacity 27d ago
But doesn't that overstate it in the other direction? Talking about investments in proportion to GDP back when any estimate of GDP probably wasn't a good measure of total economic output?

We're talking about the period before modern finance, before income taxes, back when most labor was agricultural... Did the average person shoulder the cost of railroads more than the average taxpayer today is shouldering the cost of F-35? (That's another line in Paul's post.)

[−] SlinkyOnStairs 27d ago

> Makes it a little less dramatic. But also shows what a big *'n deal the railroads were!

It also makes it more dramatic, consider the programs on the list and what they have in common.

* The Apollo program. A government-funded science project. No return on investment required.

* The Manhattan Project. A government-funded military project. No return on investment required.

* The F-35 program. A government-funded military project. No return on investment required.

* The ISS. A government-funded science project. No return on investment required.

* The Interstate Highway System. A government-funded infrastructure project. No return on investment required.

* The Marshall Plan. A government-funded foreign policy project. No return on investment required.

The actual return on investment for these projects comes over the very long term of decades: economic development, national security, and scientific progress that benefits the entire country if not the entire world.

Consider the Marshall Plan in particular. It was a massive money sink, but its nature as a government project meant it could run at losses without significant economic risk and could aim for extremely long-term benefits. It paid dividends until January last year: 77 years.

And that dividend wasn't always obvious. Goodwill from Europe towards the US is what has prevented Europe from taking actions similar to China's against the US's Big Tech companies, many of whom relied extensively on 'dumping' to push European competitors out of business. A more hostile Europe would have taken much more protectionist measures and ended up much like China, with its own crop of tech giants.

And then there are the two programs left out: the railroads and AI datacenters. Private enterprise simply does not have the luxury of sitting on its ass waiting for benefits to materialize 50 years later.

As many other comments in this thread have already pointed out: When the US & European railroad bubbles failed, massive economic trouble followed.

OpenAI's window for showing (partial) return on investment is as short as this year, or their IPO risks failure. And if they fail, similar massive economic trouble is assured.

[−] cousin_it 27d ago
Wild graphic. US spending on one flying killing machine (the F-35) is comparable to total spending on the Marshall plan to reconstruct Europe after WWII, or the interstate highway system, or all datacenters combined. Priorities!
[−] dghlsakjg 27d ago
The railroads and the interstate are arguably the biggest and broadest impact, especially in 2nd order effects (everything West of the Mississippi would be vastly different economically without them).

I am not an ai-booster, but I would not be surprised at AI having a similar enabling effect over the long term. My caveat being that I am not sure the massive data center race going on right now will be what makes it happen.

[−] LeCompteSftware 27d ago
It seems a little silly to put 71 years of private-and-public-sector infrastructure development alongside something highly targeted like the Manhattan Project. It might make more sense to compare the Manhattan Project to the first transcontinental railroad, as a similar targeted but enormously ambitious project amounting to a major technical milestone.

Likewise I don't think it makes sense to compare post-ChatGPT hyperscaler data center construction with all 19th-century US railroad construction. Why not include the already considerable infrastructure of pre-AI AWS/Azure? The relevant economic change isn't "AI," it's having oodles of fast compute available online and a market demanding more of it. OTOH comparing these data centers to the Manhattan Project is wrong in the opposite direction: we should really be comparing a specific headline-grabber like Stargate.

This categorization is just a confusing mishmash. The real conclusion to draw here is that we tend to spend more on long-term and broadly-defined things than we do on specific projects with specific deadlines. Indeed.

[−] maxglute 27d ago
Depreciation schedule:

Tulips: weeks

GPUs: 6 years

Fiber: 20-50 years

Rail, roads, bridges: 50-100+ years

Hyperscalers closer to tulips than other hard infra.

[−] comfysocks 26d ago
Railroad looks huge on the GDP (estimate) chart because the US transcontinental railroad was built in the mid-1800s, when the US economy was relatively tiny.
[−] chatmasta 27d ago
I’m surprised there is no broadband rollout or telecom network on there. I guess it’s hard to quantify the cost within a specific event?
[−] j-bos 27d ago
As sibling comments mentioned, this is a deceptive comparison as well. How about comparing as a percentage of gross energy output? https://www.sciencedirect.com/science/article/abs/pii/S09218...
[−] hyperbovine 27d ago
The railroad buildout was a lot more, idk, tangible. Most of that money was spent employing millions of people to smelt iron, lay track, build bridges, blow up mountains, etc. It’s a lot more exciting than a few freight loads of overpriced GPUs.
[−] globular-toast 27d ago
Were? How else do you expect to get goods around by land?
[−] lukeschlather 27d ago
This seems like a total category error. The Railroads are the only example that actually seems comparable, in being an infrastructure build out that's mostly done by a variety of private companies. Examples of things that would be worth comparing to the datacenter boom are factory construction and utilities (electrification in the first half of the 20th century, running water, gas pipes.)
[−] operatingthetan 27d ago
Is this an appropriate spend and risk? I'm starting to feel as if we have been collectively glamoured by AI and are not making sound decisions on this.
[−] spprashant 27d ago
I think all misgivings about AI would go away fast if it solved one important problem for humanity: carbon nanotubes for space elevators, sustainable nuclear fusion, or something of that ilk.
[−] therein 27d ago
I really dislike the term hyperscaler. Comes off very insincere. They came up with it themselves, didn't they? What's the official definition supposed to be now? Companies that are setting up as many GPU/TPU server clusters as possible for a demand that's yet to exist?
[−] dlenski 27d ago
There's a pretty big missing case in this comparison: nuclear weapons.

The US spent ~$12 trillion in ~2024 dollars on nuclear weapons between 1940 and 1996, and the vast majority of that spending was in the 1950s and early 1960s.

https://en.wikipedia.org/wiki/Nuclear_weapons_of_the_United_...

[−] hargup 27d ago
Justin Lebar (he built the XLA compiler and worked at OpenAI) has an amazing talk about this subject: https://youtu.be/cyJU32ivIlk?si=gYuHtzMJIvaSqcht
[−] mattas 27d ago
Is this _actual_ spend? Like dollars actually changing hands?

Or is this "we said we are going to invest $X"? What about the circular agreements?

[−] uejfiweun 27d ago
Does anyone have any plans for what to do with all these chips and things once they are obsolete? I can't imagine they are all just going to go to some scrap heap.
[−] losvedir 27d ago
Does anyone know what's included in "datacenter capex"? In particular, does that include spending for associated power generation? Because whether or not the AI craze pans out, if we've built a whole bunch of power plants (and especially solar, wind, hydro, etc) that would be a big win.
[−] SpicyLemonZest 27d ago
Gentle reminder that the cost of producing well-formatted graphs is much, much lower than it used to be. We grew up in a world where the mere existence of this graph would prove that someone put a great deal of effort into making it, and now it does not. I have no specific reason to doubt the information, but if you want to have reliable epistemic practices, you can no longer treat random graphs you find on social media as presumptively true.
[−] djoldman 27d ago
Just for context, Amazon+Microsoft+Alphabet+Meta+Oracle total revenue for the 5 years ending in 2025 was...

~$6.5 trillion

[−] danielmarkbruce 26d ago
It's not a project. It's just a lot of money being spent on compute across hundreds/thousands of similar projects.

An analogy would be "all the money spent on transportation infra" over some period of time.

[−] kerblang 27d ago
Adjusted for inflation?

edit - sorry, it is in fact adjusted, text is kinda hard to see

[−] philip1209 27d ago
Would love to see Apple’s china investment on this chart.
[−] throwaway27448 27d ago
Further evidence that the US, for whatever reason, lacks basic ability to rationally use resources.
[−] arisAlexis 27d ago
Because this is the last invention of man and they realize this
[−] negura 27d ago
As of November last year, data centre capex was only 60% of their revenues, which provides the business justification to increase investment further.
[−] abofh 27d ago
Not if you include tax breaks as mega projects
[−] amelius 27d ago
We could have had a space elevator by now.
[−] bawana 27d ago
only 20% of health care spending!
[−] thelastgallon 27d ago
I wonder what percentage of GDP is spent on crypto.
[−] hashmap 26d ago
if you think datacenters are a waste (they are), wait til you hear about department of war spending
[−] metalman 27d ago
we, the people, are the ultimate mega project, and it's showing
[−] jgalt212 27d ago
Just wait until the DAOs become agentic!
[−] cactacea 27d ago
Really shows where our priorities are at as a country. SMH