
The RAM shortage could last years (theverge.com)

by omer_k 523 comments 354 points

[−] stuxnet79 26d ago
Ok so Samsung, SK Hynix and Micron do not have the capacity to meet demand. Also, what little capacity they do have they are allocating to HBM over conventional DRAM. Based on my limited knowledge, HBM cannot be easily repurposed for consumer electronics. Translation: Main Street is cooked for the next 3-4 years.

It doesn't stop there though. OpenAI is currently mired in a capital crunch. Their last round just about sucked all the dry powder out of the private markets. Folks are now starting to ask difficult questions about their burn rate and revenue. It is increasingly looking like they might not commit to the purchase order they made which kick-started this whole panic over RAM.

Soo ... how sure are we that the memory makers themselves are not going to be the ones holding the bag?

[−] torginus 26d ago
The Radeon VII came out in 2019 as a $700 consumer GPU with a 1TB/s HBM2 memory subsystem which is more than any consumer GPU you can get today, including the high-end ones afaik. At that point in time, there was a whole lineup of AMD GPUs with HBM going down into the midrange.

If they could make this stuff and sell it to regular people a decade ago for very palatable prices, why do they come up with the idea that this is the technology of the gods, unaffordable by mere mortals?

[−] HerbManic 25d ago
I have been wondering this recently. The convention used to be that if you wanted to keep costs down, you kept the memory bus as narrow as possible. I still remember the awful Radeon 9200 SE: a 64-bit data bus that strangled an already slow GPU.

Heck, I have a phone with a 16-bit memory bus, for instance. The high(ish) clock rate only partly makes up the difference.

But with general prices on all components going up, it might not be such a big factor any more.

HBM might make sense for higher-end products, which can free up supply for the lower end that will never use the tech.

[−] fennecbutt 24d ago
Eh I feel like the memory bus width thing was more a case of binning memory controllers and the like.

Designing a part with a wide bus and putting the traces down on the board is what I would expect to be the easy part these days (surely).

But yield, yield comes for us all.

[−] mlvljr 25d ago
[dead]
[−] parl_match 25d ago

> why do they come up with the idea that this is the technology of the gods, unaffordable by mere mortals?

because the gods want it all and are willing to pay top dollar.

[−] varispeed 25d ago
Isn't this the case of money going from left pocket to the right, since these companies are owned by the same investment funds?

I wonder whether this is some kind of a racket.

[−] high_na_euv 25d ago
"Owned"? You mean they invested
[−] conception 25d ago
Investors are owners, yes.
[−] unmole 25d ago

> Isn't this the case of money going from left pocket to the right, since these companies are owned by the same investment funds?

No.

[−] mitjam 24d ago
Yes, it's interesting that HBM was invented through a collaboration between AMD and SK Hynix. It seems HBM is the way to go for GPUs, anyway.

The GB202 die that's in the GDDR7-based RTX 5090 and RTX 6000 Pro literally needed to be this big to support the 512-bit memory bus. It's probably only getting worse with smaller node sizes. (see https://www.youtube.com/watch?v=rCwgAGG2sZQ&t=65s).

BTW: The 1TB/s is matched by the RTX 4090 and surpassed by the RTX 5090 (1.79 TB/s).

[−] tpm 25d ago

> 1TB/s HBM2 memory subsystem which is more than any consumer GPU you can get today

5090 has 1.8 TB/s?

[−] adrian_b 25d ago
The 5090 is an overpriced outlier. A typical consumer GPU, like the RTX 5070, has about one-third the memory throughput.

Even an RTX 5080 has lower memory throughput than a Radeon VII from 2019, 7 years ago, while being much more expensive.

The memory throughput per dollar of GPUs has regressed greatly over the last 5 years, despite the widths of GPU memory interfaces having been reduced to cut production costs.

The RTX 5080 has a 256-bit memory interface, while the much cheaper Radeon VII had a 4096-bit one. The RTX 5080's memory runs far faster per pin, but NVIDIA has not used that to increase memory throughput, only to reduce production costs, while simultaneously raising the product price.
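As a sanity check, peak bandwidth is just bus width times per-pin data rate. A minimal sketch using commonly cited specs (4096-bit HBM2 at roughly 2 Gbps/pin for the Radeon VII, 256-bit GDDR7 at roughly 30 Gbps/pin for the RTX 5080; treat these figures as assumptions, not datasheet values):

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8) bytes per transfer * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

# Commonly cited specs (assumptions):
radeon_vii = bandwidth_gb_s(4096, 2.0)  # HBM2, four 1024-bit stacks
rtx_5080 = bandwidth_gb_s(256, 30.0)    # GDDR7

print(radeon_vii)  # 1024.0 -> the "1 TB/s" figure
print(rtx_5080)    # 960.0
```

The wide-but-slow HBM bus and the narrow-but-fast GDDR7 bus land within a few percent of each other, which is the whole point of the comparison above.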

[−] tpm 25d ago

> Even a RTX 5080 has a lower memory throughput than a Radeon VII from 2019, 7 years ago, while being much more expensive.

And it's faster for gaming, I guess? Which is what matters for the typical user.

Anyway you can buy much faster GPUs now than in 2019. They are also much more expensive, yes.

[−] adrian_b 25d ago
Modern GPUs like RTX 5080 are much faster for the applications that are limited by computational capabilities, mainly because they have more execution units, whose clock frequencies have also increased.

I suppose that most games are limited by computation, so they are indeed much faster on modern GPUs.

However, there are applications that are limited by memory throughput, not by computation, including AI inference and many scientific/technical computing applications.

For such applications, old GPUs with higher memory throughput are still faster.

This is why I am still using an old Radeon VII and a couple of other ancient AMD GPUs with high memory throughput.

Last year I bought an Intel GPU, which is still slower than my old GPUs, but at least it had very good performance per dollar, competitive with that of the old GPUs, because it was very cheap, while current AMD and especially NVIDIA GPUs have poor performance per dollar.

[−] dietr1ch 25d ago
then it must be the case you can't get one (for a fair price?)
[−] platevoltage 25d ago
I was gonna say, I still use an AMD Vega that uses HBM2.
[−] rasz 25d ago
And the bottleneck at the time was HBM interposers, not actual ram dies.
[−] tiberious726 24d ago
The Fury X was a beast for a consumer card, sadly limited by having a mere 4GB of (HBM) VRAM and a non-refillable AIO. But back when games could fit in 4GB, it was incredible.

I'd absolutely buy another HBM consumer GPU if it had at least 8GB (and if I got the vibe/hope that AMD will actually support it for a couple of years...)

[−] cco 26d ago
That card only had 16GB of memory; its memory bandwidth was 1TB/s.
[−] whatsupdog 25d ago
Supply and demand. The prices are high because of high demand.
[−] sidewndr46 25d ago
It also does 64 bit floating point I think?
[−] nektro 24d ago
the hardware dump once we're through the bubble is gonna be wild
[−] LNSY 25d ago
[flagged]
[−] Cthulhu_ 26d ago
To add a more local hurdle as well, the Dutch power grid is at capacity, and its operator is now telling companies that planned to build a datacenter that they can't be connected to the grid until 2030, even though said companies already paid for and got guarantees about that connection.

That is, memory capacity is being reserved for datacenters yet to be built, which will do weird things if said datacenter construction is postponed or cancelled altogether.

[−] xbmcuser 26d ago
I am betting the pendulum swings faster to the other side, to excess capacity, as all the construction lies of Altman fall through, with financiers waking up to the fact that they can't build the infrastructure as fast nor make any profit on the infrastructure that does get built.
[−] bombcar 25d ago
Don’t the memory makers always get left holding the bag? I feel this has happened at least three times before.
[−] Macha 25d ago

> Soo ... how sure are we that the memory makers themselves are not going to be the ones holding the bag?

The memory makers specifically did not scale up capacity to avoid being left holding the bag.

[−] moffkalast 26d ago
Memory makers did get themselves into this situation by selling all wafers for empty promises and alienating everyone but OpenAI tbh. I do hope they end up holding the bag once again, cause after covid and the cartel thing they don't seem to ever learn their lesson on how to have the tiniest amount of integrity.
[−] quickthrowman 25d ago

> Soo ... how sure are we that the memory makers themselves are not going to be the ones holding the bag?

I hope they do, they did not have to agree to sell so much RAM to one customer. They’ve been caught colluding and price fixing more than once, I hope they take it in the shorts and new competitors arise or they go bankrupt and new management takes over the existing plants.

Don’t put all your eggs in one basket, as the old saying goes.

[−] mschuster91 26d ago

> Soo ... how sure are we that the memory makers themselves are not going to be the ones holding the bag?

We aren't. The remaining memory manufacturers fear getting caught in a "pork cycle" yet again - that is why there's only the three large ones left anyway.

[−] naveen99 26d ago
But wouldn’t you rather HBM prices come down first? Memory makers will be fine. There is practically infinite demand, unless you get China-style rationing of compute per person worldwide.

The real issue is everyone wanting to upgrade to hbm, ddr5, and nvme5 at the same time.

[−] nostrademons 25d ago
What kind of consumer electronics can you build with HBM? That's the startup you should be founding...
[−] PunchyHamster 25d ago

> Based on my limited knowledge HBM can not be easily repurposed for consumer electronics. Translation: main street is cooked for the next 3-4 years.

It's worse. HBM has lower yields, so they are essentially making fewer GB per wafer too

[−] zamalek 25d ago
Not a rec, but just my source: Atrioc (streamer, YouTuber) is good at gathering all the facts for the rest of us. There are many other things in play, like the Strait of Hormuz (helium, bromine). Ultimately it works out that the shortage, and shortage profits, will continue; the chip makers are probably going to keep seeing record profits (as Samsung has).

The specific mix of factors could change at any time, but the supply chain is relatively inelastic, so it will take some time to show up on price labels.

[−] eldenring 25d ago

> Folks are now starting to ask difficult questions about their burn rate and revenue.

this view isn't updated correctly post-Claude Code and Codex. there will clearly be sufficient demand.

[−] pseudohadamard 25d ago
It's not going to last until 2028, it'll last until 'min(AI_bubble_burst, 2028)', which I expect will be a lot smaller than just '2028'. So the real question is, how long will it take to retool for non-HBM, and will there be a fire sale as they scramble to recover?

Which also explains why production is falling behind demand, companies aren't going to sink billions into creating product for a market that could dry up overnight.

[−] danishanish 25d ago
I think I’m missing something. Financially, what bag would the memory makers be holding here? I don’t think I’m well informed regarding how these deals were structured.
[−] zozbot234 25d ago
There's actually plenty of demand for LPDDR even in the AI datacenter, because HBM is quite wasteful of area for any given memory capacity.
[−] dpoloncsak 23d ago
I'm under the impression OAI wrote Letters of Intent, and are not actually on the hook for the RAM they requested.

The others...they did not. Memory makers won't be holding the bag because Apple/Google/Samsung are contractually obligated to purchase, after the panic OAI caused.

[−] elorant 25d ago
The market is already saturated. Even if OpenAI doesn’t buy what they reserved, other players will. SK Hynix’s CEO said there is a 20% gap between supply and demand per year. And that doesn’t account for the shock effect that will take place the moment prices normalize and everyone and their dog goes out and starts buying inventory to avoid the next crisis. I for one would certainly buy more than I currently need, just in case.
[−] diego_sandoval 25d ago
Did OpenAI's order actually kickstart the "panic" over RAM?

Are they really such a big RAM buyer?

[−] metalcrow 25d ago
Do the memory makers not have a contract in place for an order this large? I assume that they aren't going to take "trust us bro" as good enough for several million dollars in orders, and even if there is a way to cancel the order, it won't be free. I would assume so at least, but I'd like to hear from anyone who knows for certain.
[−] shevy-java 25d ago
Good point. I think both AI companies and hardware makers should pay for the damage they caused to us here.

They act as a de-facto monopoly and milk us. Why is this allowed?

[−] Se_ba 25d ago
Not all DRAM capacity can switch to HBM quickly. That lag is where the volatility comes from.
[−] Rekindle8090 25d ago
This will result in demand destruction, which will starve the enterprises, which will starve the hyperscalers. There's no situation where people not being able to afford hardware for 4 years results in the bubble not popping.
[−] hsbauauvhabzb 26d ago
The people who fucked over consumers are left holding the bag they sold us out over?

Oh no!

[−] kubb 26d ago
I would expect that OpenAI gets as much money as they ask for for the next 10 years.

There’s virtually infinite capital: if needed, more can be reallocated from the federal government (funded with debt), from public companies (funded with people’s retirement funds), from people’s pockets via wealth redistribution upwards, from offshore investment.

They will be allowed to strangle any part of the supply chain they want.

[−] fouc 26d ago
I'm a bit surprised the article makes no mention of Google's TurboQuant[0] introduced 26 days prior.

Given that TurboQuant results in a 6x reduction in memory usage for KV caches and up to an 8x boost in speed, this optimization is already showing up in llama.cpp, enabling significantly bigger contexts without having to run a smaller model to fit it all in memory.

Some people thought it might significantly improve the RAM situation, though I remain a bit skeptical - the demand is probably still larger than the reduction turboquant brings.

[0] https://news.ycombinator.com/item?id=47513475
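For scale, here is a rough back-of-envelope for what a 6x KV-cache reduction buys. The model dimensions below are hypothetical, and this only illustrates the sizing arithmetic, not TurboQuant's actual method:

```python
def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int) -> int:
    """Size of a transformer KV cache: one K and one V tensor per layer."""
    return 2 * layers * kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical mid-size model at a 32k-token context, fp16 cache:
fp16 = kv_cache_bytes(layers=32, kv_heads=8, head_dim=128,
                      seq_len=32768, bytes_per_elem=2)
print(fp16 / 2**30)       # 4.0 GiB uncompressed
print(fp16 / 6 / 2**30)   # ~0.67 GiB under a claimed 6x reduction
```

So for a cache of this size, a 6x reduction frees roughly 3.3 GiB, or equivalently lets the same memory hold about six times the context.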

[−] LastTrain 25d ago
Something I haven’t been able to reconcile: If AI makes software easier to create, that will drive the price down. How are software companies going to make enough revenue to pay for AI, when the amount of money being spent on AI is already multiples of the current total global expenditure on software? This demand for RAM is built on a foundation of sand, there will be a glut of capacity when it all shakes out.
[−] cbdevidal 25d ago
I’m a bit of an optimist. I think this will smack the hands of developers who don’t manage RAM well and future apps will necessarily be more memory-efficient.
[−] chintech2 26d ago
I'm a bit surprised the article makes no mention of China's new memory companies.

[0] https://techwireasia.com/2026/04/chinese-memory-chips-ymtc-c...

[−] tim-projects 26d ago
The era of optimisation is finally here. I'm excited.
[−] nilkn 25d ago
As an aside, recently I wanted to refresh my gaming PC, but the price shock and general lack of availability of buying components individually made it seem hardly worth it, so I just kept deferring the project.

Then, mostly by chance, I saw that my local Microcenter had some pre-builts for sale, and I ended up picking one up for <$5k that had "best in slot" components across the board, including a 5090 and even a high-end power supply.

The last time I built a gaming PC was upwards of a decade ago, and at that time the prevailing wisdom was to never buy a pre-built unless you had a massive amount of disposable income and couldn't spare even just one weekend to dedicate to a hobby project that could benefit you for years. Now, it was absolutely a no-brainer.

[−] rzmmm 26d ago
It seems that RAM manufacturers are still reluctant to increase production. Do they know something about long-term RAM demand that investors don't?
[−] senfiaj 25d ago
I wonder if this might motivate people to write more memory-efficient software. I mean, we have so much memory, but even some trivial programs eat hundreds of megabytes of RAM.
[−] WesolyKubeczek 26d ago
I fear that the real reason we have a shortage, I mean, the real reason for the demand, is AI companies scooping up what they can so that their competitors, whether existing or up-and-coming, can’t get to it.
[−] thelastgallon 25d ago
It will last forever. After covid, all manufacturers understood the value of limiting supply and extracting profits. Cars used to be super cheap before covid; they will never go back to the same levels.

From now on, RAM will always be super costly for consumers, because they can't make massive deals like Apple/OpenAI/etc. We are the bagholders.

[−] lousken 26d ago
If only we had not allowed oligopolies to exist. Meanwhile, the EU is not in the race at all and the US has very few fabs.
[−] fsckboy 25d ago
if a shortage lasts years, it's not a shortage. "The market clearing price of RAM in the face of expected sustained healthy demand should lead to a stable market for years."

even if gaming is and will remain very popular for years, it and the desire to upgrade gaming rigs are still discretionary, with more price elasticity of demand than corporate uses for RAM at the dawn of the AI age. gamers live on the margin of this market, where low prices will stimulate upgrades and high prices will lead to holding out. The complaints about price are real, but that segment of the market is some combination of less large and less important.

[−] thijson 25d ago
I've read that the chip manufacturers are looking into high-bandwidth flash for on-package storage of AI models. That would solve some of the cost issue; flash is significantly cheaper than DRAM.
[−] Hamuko 26d ago
I'm personally hoping that one of the AI or data center companies is suddenly unable to pay its bills and deflates the entire industry. Probably the only hope of things getting better before the 2030s.
[−] BirAdam 25d ago
Of course, alternatively, the AI companies could go bust before finding profitability. Then, there’d be a ton of supply, prices would crash, and one or two of the current memory suppliers would go out of business. After that, the new Chinese memory companies might be producing at volume, and Renesas could be up and running.

At the moment, nothing is certain. Could this last? Sure. Could it not last? Yup.

[−] 1o1o1o1o1 25d ago
Hilarious. The RAM in the PC I built 5 years ago will soon be worth more than I spent building the whole PC.
[−] librasteve 25d ago
The RAM market is a square wave
[−] rldjbpin 25d ago
let the analysts and the news say what they want - the entire situation is artificial and is up to the manufacturers.

the current relative spike in prices misses the medium-term trend: the vast decrease in memory prices post-covid that led to the recent surge. the cartel got another opportunity to make bank and they will use that lever to the max.

funnily enough i've been personally stuck with 16 gigs since 2015, across three memory generations! but i am used to the past, when you would spend 80-100 on an 8gb stick (JEDEC timings, nothing fancy, but from a major brand), without accounting for inflation.

[−] p0w3n3d 25d ago
We have the saying in my country: The days of things being cheap are over.
[−] jmyeet 25d ago
This is simple extrapolation from current demand, nothing more. And that's a borderline silly analysis because it assumes the AI bubble won't burst. The great misadventure in the Persian Gulf probably accelerates that because we're almost certainly going to be facing a recession.

Another thing I've been thinking about is what happens when the next generation of NVidia chips comes out? I suspect NVidia is going to delay this to milk the current demand but at some point you'll be able to buy something that's better than the H100 or B200 or whatever the current state-of-the-art for half the price. And what's that going to do to the trillions in AI DC investment?

I'm interested in when the next bump in DRAM chip density is coming. That's going to change things, although it seems much of production has moved from consumer DRAM chips to HBM chips. So maybe that won't help at all.

I do think that companies will start seeing little to no return from billions spent on AI, and that's going to be a problem. I also think that the hundreds of billions of capital expenditure by OpenAI is going to come crashing down, as there just isn't any even theoretical future revenue that can pay for all that.

[−] onchainintel 25d ago
Your instincts are likely right on this one, OP. Memory prices surged 80-90% in Q1 2026 compared to Q4 2025, with DRAM, NAND, and HBM all at record highs. Three suppliers for the entire planet?
[−] vectorhacker 25d ago
It sounds to me like an incentive for new companies to make RAM.
[−] shevy-java 25d ago
I want those AI companies that drove the prices up, to pay an immediate back-tax to all of us.

I don't want to pay more because of AI companies driving the price up. That is milking.

[−] ares623 25d ago
Are we entering the Reverse-Moore's Law era?
[−] tomaytotomato 26d ago
I just checked the gaming PC I built a few years ago with 64GB of DDR5 RAM; it's actually gone up in value, which is generally unheard of.

Think I will scrap my PC and sell its parts.

I wonder if there are any niche companies out there building decent rigs with DDR3 and 5th/6th-generation Intel CPUs; it's cheap and might be a business opportunity?

[−] alprado50 25d ago
I'm thankful I bought 16GB of RAM, but what's going to happen in 5 years when users' PCs start to fail?
[−] cozzyd 25d ago
I bought a workstation with 3 TB of ram for FDTD simulations last year. Glad I got it then ...
[−] cicko 24d ago
Usually, right after articles like this, things come crashing down.
[−] cylemons 25d ago
When I personally use ChatGPT and friends, I don't see any slowdowns or anything, meaning their servers can handle the load just fine. So then, why are these companies spending so much building new capacity if the current capacity is enough?
[−] WhereIsTheTruth 25d ago
Fabricated shortage to hasten the US CHIPS Act and US Chip Security Act
[−] zizheruan 25d ago
Sad news, I didn't buy enough RAM before....
[−] 5255652 25d ago
Can we stop advertising paid blog/news websites we can't read without a subscription?
[−] Gud 26d ago
Thank god they shut down 3D XPoint.
[−] bschwindHN 25d ago
But thank god we were all able to generate some SVGs of pelicans, right guys?
[−] marcus_holmes 25d ago
This could be great.

There's a future where RAM makers tool up for this massively increased demand, then the AI companies go broke as the bubble bursts, so RAM is cheap as. So laptop manufacturers get on that and start making laptops with 1TB+ memory so we can run decent LLMs on the local machine. Everyone happy :)

[−] ochre-ogre 26d ago
can't read the article due to a paywall.
[−] Chrisszz 25d ago
[dead]
[−] black_13 25d ago
[dead]
[−] coldtea 25d ago
Expect shortages across the board. RAM? That's the tip of the iceberg, think food and gas.
[−] sph 26d ago
I fear the author and most commenters are not aware of the law of supply and demand. If there is demand for consumer RAM, there will be supply for consumer RAM. It just takes time and risk assessment to scale up operations.

We have RAM shortage now, we will have very cheap RAM tomorrow. It’s not like production is bottlenecked by raw materials. Chip companies just need to assess if the demand by AI companies will last so it’s better to scale up, or perhaps they should wait it out instead of oversupplying and cutting into their profits.