I don't buy the central thesis of the article. We won't be in a supply crunch forever.
However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
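(For illustration: a minimal Python sketch of this kind of thin-client setup - not the actual tooling described above - assuming the workstation is reachable over Tailscale under a hypothetical MagicDNS name, "workstation", and already runs an SSH server with key-based auth.)

    #!/usr/bin/env python3
    """Run a command on the remote workstation over its Tailscale hostname."""
    import subprocess
    import sys

    REMOTE = "workstation"  # hypothetical Tailscale MagicDNS name

    def run_remote(command: str) -> int:
        # "-t" allocates a TTY so interactive tools (htop, editors) work too.
        return subprocess.call(["ssh", "-t", REMOTE, command])

    if __name__ == "__main__":
        # e.g. `python3 remote.py nvidia-smi` from the disposable laptop
        sys.exit(run_remote(" ".join(sys.argv[1:]) or "hostname"))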
We won't be in a supply crunch forever. We'll have a demand crunch. The demand for powerful consumer hardware will shrink so much that producing it loses economies of scale. It was always bound to happen, just delayed by the trend of pursuing realistic graphics for games.
People who are willing to drop $20k on a computer might not be affected much tho.
> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
We are on borrowed time: most of the world runs on oil, and that resource is not unlimited at all. A lot of countries have already passed their production peak, meaning it's only downhill from here. Everything is going to get more expensive. Our lavish "democratic" lifestyles are only possible because we have (had) this amazing, freely available resource, and without it things will change. Even at a geopolitical scale you can see this pretty clearly: countries that talked up free markets and free exchange are now starting to close their doors and play individually. Anyway, my point is, we are in for decades, if not a century, of slow decline.
"I personally dropped $20k on a high end desktop . . . "
This is where I think current hackers should be headed. I grew up with lots of family who were backyard mechanics, wrenching on cars and motorcycles. Their investment in tools made my occasional PC purchase look extremely affordable. Based on what I read, senior mechanics often have five-figure US dollar investments in tools. Of course, I guess high quality torque wrenches probably outlast current GPU chips? I'd hate to be stuck making a $10K investment every 24 months on a new GPU . . .
I have been renting GPU resources and running open weight models, but recently my preferred provider simply doesn't have hardware available. I'm now kicking myself a little for not simply making a big purchase last fall when prices were better.
> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node.
How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?
Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are being sold in high numbers for little desktop systems.
It seems like you largely agree with the article - people shall own nothing and be happy. Perhaps the artificially induced supply crunch could go on indefinitely.
Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I made in the last five years?
Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.
> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
It really feels like we're slowly marching back to the era of mainframe computers and dumb terminals. Maybe the democratization of hardware was a temporary aberration.
Superficially speaking, I believe you could be right. But I think it has been realised that engineering scarcity of products and commodities is a power move.
We live in a world optimised for globalization: industry in China, oil in the Middle East, etc...
This approach proved to be fragile in the hands of people with enough money and/or power to tilt the scale.
I think you're probably right, but I'm not so confident the supply crunch will end.
Tech feels increasingly fragile with more and more consolidation. We have a huge chunk of advanced chip manufacturing situated on a tiny island off the coast of a rising superpower that hates that island being independent. Fabs in general are so expensive that you need a huge market to justify building one. That market is there, for now. But it doesn't seem like there's much redundancy. If there's an economic shock, like, I dunno, 20% of the world's oil supply suddenly being blockaded, I worry that could tip things into a death spiral instead.
>we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
I thought the trend is in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+). Just less VRAM and fewer tensor cores, artificially.
Where is the divergence happening? Or you don't view RTX 5x as consumer hardware?
I upgraded my desktop last year (motherboard, cpu, RAM) and I felt like I wanted 64GB of DDR5 but figured I might need 128GB in a year or so. Normally, I would have bought the 64GB and waited to get the extra RAM later. Price usually dropped over time.
Boy, am I glad I decided to get the whole 128GB before RAM prices spiked!
This will be me. Bestowing upon my descendants a collection of Mighty Beanz, a few unkillable appliances, and the best consumer computing hardware the early 2020s could buy.
And I fear they will be equally confused and annoyed by disposing of all of them.
I don't share the same 1:1 opinion with regards to the article, but it is absolutely clear that RAM prices have gone up enormously. Just compare them. That is fact.
It may be cheaper later on, but ... when will that happen? Is there a guarantee? Supply crunch can also mean that fewer people can afford something because the prices are now much higher than before. Add to this the oil crisis Trump started and we are now suddenly having to pay more just because a few mafiosi benefit from this. (See Krugman's analysis of the recent stock market flow of money/stocks.)
> Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
What are you talking about?
My laptops are, and always have been, primarily places where I do local computing. I write code there, I watch movies there, I listen to music there, I play games there...all with local storage, local compute, and local control (though I do also store a bunch of my movies on a personal media server, housed in my TV stand, because it can hold a lot more). My smartphone is similar.
If you think that the vast majority of the work most people do on their personal computers is moving to LLMs, or cloud gaming, then I think you are operating in a pretty serious bubble. 99.9% of all work that most people do is still best done locally: word processing, spreadsheets, email, writing code, etc. Even in the cases where the application is hosted online (like Google Docs/Sheets), the compute is still primarily local.
The closest to what you're describing that I think makes any sense is the proliferation of streaming media—but again, while they store the vast libraries of content for us, the decoding is done locally, after the content has reached our devices.
It doesn't matter if a cutting-edge AI-optimized server can perform 10, 100, or 1000 times better than my laptop at any particular task: if the speed at which my laptop performs it is faster than I, as a human, can keep up (whatever that means for the particular task), then there's no reason not to do the task locally.
The general take here seems to be "everything eventually passes". That isn't always true. I wonder how many people have a primary computing device that they don't even have full control over now (Apple phones, tablets...). Years ago the concept of spending over $1k on a computer that I didn't even have the right to install my own software on was considered ridiculous by many people (myself included). Now many people primarily consume content on a device controlled almost entirely by the company they bought it from. If the economics lead to a situation where it's more profitable to sell you compute time than to sell you computers, then businesses will choose not to sell you computers. I have no idea if that is what ends up happening.
A long article begging the question, when the last paragraph or two counters the panic of the beginning. Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening as the existing producers move to selling to enterprise/hyperscalers.
There have been memory chip panics before; the US funded RAM production back in the '80s/'90s in competition with Japan at the time.
The AI boom/"hyperscale" currently is almost exactly like the dotcom boom.
It's already starting to shake down. Anthropic is occupying the developer space, OpenAI has just exited the video/media production space. More focused and vertical market AI is emerging.
The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc etc is going to break.
This may not be entirely appropriate to the reasons behind the article, but it feels tangentially related:
I'd like to say a brief thank you to what the brief, golden period of globalisation was able to bring us.
I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.
I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.
Oh man, I've come across this person's blog before and I love it, not just because of the personalization/personality they've put into the site's design, but because of all of the random CLI/TUI-based tools they've developed. Examples:
- https://xn--gckvb8fzb.com/projects/
Their github repos:
- https://github.com/mrusme
They even built a BBS-style reader client that supports Hacker News:
https://github.com/mrusme/neonmodem
I miss the days of the web being weird like this :-)
Just to mention one thing: helium - which is a necessity for chip production - is a byproduct of LNG production. And 20% of that is just gone (Qatar), and the question is how long it will take to get it back. So not only a chip shortage because of AI buying chips in huge volumes, but also because production will be hampered.
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
This article inspired me to look and see what this computer is. Apparently it is an "AMD Athlon(tm) II X2 250 Processor" from 2009. So 17 years old. It has 8 GB of DDR3 memory and runs at 3 GHz. It currently has OpenBSD on it, but at least one source thinks it could run Windows 10.
The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire wrapped 8080 based system. That was followed by an also wire wrapped 8086 based system of my own design I used for day to day computing tasks (it ran Forth). If someone like me can get to the point of not caring, there is no real reason for anyone else to care.
The article's dystopia section is dramatic, but the practical point is real. I've been self-hosting more and more over the past year, specifically because I got uncomfortable with how much of my stack depended on someone else's servers.
Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.
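(As a hedged sketch of what "SQLite instead of a managed database" can look like in practice - file names and paths below are made up - everything here is Python standard library, so it runs on modest hardware with no external service.)

    #!/usr/bin/env python3
    """Tiny self-hosted datastore: a single SQLite file instead of a managed DB.

    The database is one local file (notes.db, hypothetical name); the flat
    files mentioned above can sit alongside it in the same directory.
    """
    import sqlite3
    from pathlib import Path

    DB_PATH = Path.home() / "selfhosted" / "notes.db"  # hypothetical location

    def main() -> None:
        DB_PATH.parent.mkdir(parents=True, exist_ok=True)
        conn = sqlite3.connect(DB_PATH)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS notes ("
            "id INTEGER PRIMARY KEY, body TEXT, created TEXT DEFAULT CURRENT_TIMESTAMP)"
        )
        with conn:  # commits automatically on success
            conn.execute("INSERT INTO notes (body) VALUES (?)", ("own your data",))
        count = conn.execute("SELECT COUNT(*) FROM notes").fetchone()[0]
        print(f"{count} notes stored in {DB_PATH}")

    if __name__ == "__main__":
        main()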
The article's entire thesis looks like it can be completely derailed if one thing happens: AI infrastructure firms cease to be able to secure more capital.
Is that likely? History says it's inevitable, but the timeframe is an open question.
As long as there are consumers paying for hardware ownership, there will be businesses willing to sell it to them. The worst scenario I could imagine is that one has to pay a premium for fully-owned hardware simply because consumers' desire for it becomes an oddity and it is thus sold in low quantities.
The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
I don't see this from an infinite-shortage point of view; I see this from a locked-down hardware point of view. Old hardware is hackable, new hardware mostly isn't. That, for me, is where the real pain is and why I just buy old computers and phones that are rootable.
In the last month, 20-30% of oil supply, 30% of gas supply, and 30-40% of fertilizer production has been destroyed and could take anywhere from 8 months to 5 years to come back online. Governments are acting as if everything is okay so that there is no panic, but we have crossed the point of no return: even if the war ends today, food and energy shortages are on the horizon.
If you can get an EV, solar, heat pumps, battery storage, etc., get it now, because fossil-fuel-based energy prices are going to go through the roof. I see similarities to when COVID hit: people kept looking at things happening in other countries and not preparing for the shit to hit their own cities and countries.
In such a future, are the iPhone and Android ecosystems dead? Because a single $1k phone is a hell of a computer. So if you can still buy a phone, you can still get a computer. Local AI aside, these are very capable.
To the people saying "The shortage won't last forever": yes, you might be right. However, such a supply crunch creates a perfect vacuum for rapid change to fill the hardware landscape of computing and shift the balance of power.
Think about it like this: imagine the AI/Cloud/Crypto companies who are buying up all these compute and storage resources realize they now control the compute hardware market, becoming compute lords. What happens when joe/jane six pack or company xyz needs a new PC or two thousand but can't afford them due to the supply crunch? Once the compute lords realize they control the compute supply, they will move to rent you their compute, trapping users in a walled garden. And the users won't care, because they aren't computer enthusiasts like many of us here. They only need a tool that works. They *do not* care about the details.
The hardware lords could further this by colluding with the vendors they have exclusivity with to build proprietary, weaker terminal devices with just enough local RAM and storage to connect to a remote compute cluster. Hardware shortage solved!
All they need to do is collude with the hardware makers via circular contracts to keep buying hardware in "anticipation of the AI driven cloud compute boom." The hardware demand cycle is kept up and consumers are purposefully kept out of the market, pushing people into walled gardens.
This is unsustainable of course and will eventually fall over, but it could tie up computing resources for well over a decade as compute lords dry up the consumer hardware market, pushing people to use their hoarded compute resources instead of owning their own. We are in a period where computing serfdom is a plausible outcome, one that could do a lot of damage to freedom of use, hardware availability, and the future ability to use the internet freely.
Hold onto your hardware. Hold on to your existing software and the current version. Don’t upgrade without a specific need. None of the “progress” is actually helpful to hackers and I’m not sure it’s even helpful to typical users. There’s enough information being given to and slurped by others, don’t make it more effective.
> For the better part of two decades, consumers lived in a golden age of tech. Memory got cheaper, storage increased in capacity and hardware got faster and absurdly affordable.
I got my first PC circa 1992 (a 2nd hand IBM PS/2, 80286 processor with 2MB RAM and 30MB HDD) and the "golden age" was already there. We are well over 40 years into almost uninterrupted "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.
The actually important change is that, for most consumer uses, the performance improvements stopped making sense already, what, over 10 years ago?
I actually think the central thesis is thought-provoking: we have shifted far away from locally installed shit to remote data centre access, initially driven by cloud-based initiatives and now spiralling upwards because of AI. For any researchers, hackers, and builders wanting to play with locally installed AI, hardware could become a bottleneck, especially as many machines, such as the beloved Macs, are not upgradable.
When you click away to another tab, the title and favicon of the page change to something weird, but really legit looking.
a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
When I started programming in the early 80's, personal computing had just recently become a thing. Before that, if you wanted to learn to program, you first needed access to a very rare piece of hardware that only a select few were granted access to. But when personal computing became a reality, programming exploded - anybody could learn it with a modest investment.
I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.
It is wild thinking how, a few years ago, I didn't buy a 4090 direct from Nvidia because "$1600 (USD) is too much to pay for a graphics card; if I need a better one, I'll upgrade in a few years." (Went with the 4080, which is substantially slower and was $1200.) Joke's on me!
It will be a scarcity mindset from here on out; I'll always buy the top-tier thing.
I have often imagined writing a book, roughly "Fahrenheit 451 but with computers instead of books". Imagine a world where you do not buy an iPhone - one is assigned to you at birth - a world where "installing software" on "a computer you own" is not just antiquated or taboo, but unthinkable.
I've seen comments on here before that went somewhere along the line of "adults don't care about RAM prices." HN is no stranger to siding with the oppressors.
I grabbed an upgrade at the end of last year because my ~10 year old workhorse is starting to show signs of aging. Despite 16 gigs of RAM having lasted me thus far I decided to bite the bullet and get 32; so I expect this new machine to last me another 10 years (although I now have a full SSD, whereas my old workhorse had an SSD for the OS and a hybrid drive for /home, so we'll see whether or not it will actually last).
Oh bubbles... they're so bubbly. Remember when there was an unlimited demand for fibre optics because - The Internet?
So Nortel and other manufacturers lent money to their clients building the Internet because the growth was unlimited forever? Except they actually didn't have any money, just stock valuations?
"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
Not to mention Age Verification / KYC being baked into every future OS and device. Buy and hodl to have a hope of independent, censorship-resistant computing in the future.
We are in a renaissance of computing right at this moment. If we expand our definition of computers beyond screens and traditional input devices, microcontrollers are capable of so much more, with so much less (energy consumption | RAM | storage).
The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.
So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
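(To make the MCU point above concrete, a minimal, hedged sketch in MicroPython - assuming an ESP32-class board with the standard "network" and "socket" modules; the SSID, password, and target address are placeholders.)

    # MicroPython: join WiFi and fire a UDP datagram at a listener on the LAN.
    import network
    import socket
    import time

    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect("my-ssid", "my-password")   # placeholder credentials
    while not wlan.isconnected():            # wait for DHCP to complete
        time.sleep(0.1)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"temp=21.5", ("192.168.1.50", 9999))  # placeholder receiver
    sock.close()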
It's not that I disagree with the basic premise and concern of the text, but I'm not convinced about the "RAM shortage will lead to thin clients" argument, because the thin client is going to be a browser.
Everything today is a web app. If it doesn't exist and you want to vibe code it? It's probably going to become a web app, vibed using a web app.
The problem is, web apps are stupendous memory hogs. We're even seeing Chromebooks with 8 gigs of RAM now. LLMs are all trained for and implemented in apps assuming the user can have $infinity browsers running, whether it's on their PC or on their phone. It's going to be very hard to change that in a way that's beneficial to what passes for business models at AI companies.
My phone has 16 gigs of RAM and a terabyte of storage; laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
> "I personally dropped $20k on a high end desktop"
This absolutely boggles my mind. Do you mind if I ask what type of computing you do in order to justify this purchase to yourself?
Mainframe (thin) -> PC (fat) -> Internet/Cloud (thin) -> Mobile (fat) -> AI (thin)
I expect this to continue until the next technology transition.
In each of these shifts, and there have been others, things are never completely fat or thin; it's more of an in-between state that leans toward either local or cloud.
> Laptops are increasingly just clients for someone else's compute
Are you kidding? Apple's mobile chips are now delivering performance that AMD and Intel desktops never could or did.
Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
Ah, the paradoxes of modern software.